Dec 03 22:05:10 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 22:05:10 crc restorecon[4682]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 22:05:10 crc restorecon[4682]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc 
restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc 
restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 
22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc 
restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 22:05:10 crc 
restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 
crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 
22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 22:05:10 crc 
restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc 
restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 22:05:10 crc restorecon[4682]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 
crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc 
restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc 
restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 22:05:10 crc restorecon[4682]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 22:05:10 crc restorecon[4682]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 22:05:11 crc kubenswrapper[4830]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 22:05:11 crc kubenswrapper[4830]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 22:05:11 crc kubenswrapper[4830]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 22:05:11 crc kubenswrapper[4830]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 22:05:11 crc kubenswrapper[4830]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 22:05:11 crc kubenswrapper[4830]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.138765 4830 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141531 4830 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141545 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141550 4830 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141554 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141558 4830 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141563 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141566 4830 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141570 4830 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141574 4830 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 
22:05:11.141577 4830 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141581 4830 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141586 4830 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141594 4830 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141598 4830 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141602 4830 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141607 4830 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141611 4830 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141614 4830 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141618 4830 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141622 4830 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141626 4830 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141629 4830 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141633 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141637 4830 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141641 4830 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141645 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141649 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141652 4830 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141656 4830 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141660 4830 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141665 4830 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141670 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141675 4830 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141681 4830 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141685 4830 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141689 4830 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141693 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141696 4830 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141701 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141705 4830 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141709 4830 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141714 4830 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141718 4830 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141722 4830 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141726 4830 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141729 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141733 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141737 4830 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141740 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141743 4830 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141748 4830 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141752 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141755 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141759 4830 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141762 4830 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141766 4830 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141769 4830 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141772 4830 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141776 4830 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141779 4830 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141783 4830 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141786 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141790 4830 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141793 4830 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141797 4830 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141800 4830 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141804 4830 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141807 4830 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141811 4830 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141814 4830 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.141818 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142082 4830 flags.go:64] FLAG: --address="0.0.0.0"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142092 4830 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142099 4830 flags.go:64] FLAG: --anonymous-auth="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142105 4830 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142110 4830 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142114 4830 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142120 4830 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142124 4830 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142129 4830 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142133 4830 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142138 4830 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142142 4830 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142146 4830 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142150 4830 flags.go:64] FLAG: --cgroup-root=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142154 4830 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142158 4830 flags.go:64] FLAG: --client-ca-file=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142163 4830 flags.go:64] FLAG: --cloud-config=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142167 4830 flags.go:64] FLAG: --cloud-provider=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142171 4830 flags.go:64] FLAG: --cluster-dns="[]"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142176 4830 flags.go:64] FLAG: --cluster-domain=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142180 4830 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142185 4830 flags.go:64] FLAG: --config-dir=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142190 4830 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142194 4830 flags.go:64] FLAG: --container-log-max-files="5"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142200 4830 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142204 4830 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142208 4830 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142212 4830 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142216 4830 flags.go:64] FLAG: --contention-profiling="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142220 4830 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142224 4830 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142230 4830 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142234 4830 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142239 4830 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142243 4830 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142247 4830 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142251 4830 flags.go:64] FLAG: --enable-load-reader="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142255 4830 flags.go:64] FLAG: --enable-server="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142259 4830 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142265 4830 flags.go:64] FLAG: --event-burst="100"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142269 4830 flags.go:64] FLAG: --event-qps="50"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142273 4830 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142277 4830 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142281 4830 flags.go:64] FLAG: --eviction-hard=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142286 4830 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142290 4830 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142294 4830 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142298 4830 flags.go:64] FLAG: --eviction-soft=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142302 4830 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142306 4830 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142310 4830 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142314 4830 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142318 4830 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142322 4830 flags.go:64] FLAG: --fail-swap-on="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142326 4830 flags.go:64] FLAG: --feature-gates=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142331 4830 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142335 4830 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142340 4830 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142344 4830 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142348 4830 flags.go:64] FLAG: --healthz-port="10248"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142352 4830 flags.go:64] FLAG: --help="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142356 4830 flags.go:64] FLAG: --hostname-override=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142361 4830 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142365 4830 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142369 4830 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142373 4830 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142377 4830 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142381 4830 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142385 4830 flags.go:64] FLAG: --image-service-endpoint=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142389 4830 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142393 4830 flags.go:64] FLAG: --kube-api-burst="100"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142397 4830 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142401 4830 flags.go:64] FLAG: --kube-api-qps="50"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142405 4830 flags.go:64] FLAG: --kube-reserved=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142409 4830 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142413 4830 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142417 4830 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142421 4830 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142425 4830 flags.go:64] FLAG: --lock-file=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142429 4830 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142433 4830 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142437 4830 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142443 4830 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142447 4830 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142451 4830 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142455 4830 flags.go:64] FLAG: --logging-format="text"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142459 4830 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142463 4830 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142467 4830 flags.go:64] FLAG: --manifest-url=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142471 4830 flags.go:64] FLAG: --manifest-url-header=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142477 4830 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142481 4830 flags.go:64] FLAG: --max-open-files="1000000"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142486 4830 flags.go:64] FLAG: --max-pods="110"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142490 4830 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142494 4830 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142499 4830 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142503 4830 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142521 4830 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142526 4830 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142530 4830 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142540 4830 flags.go:64] FLAG: --node-status-max-images="50"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142543 4830 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142547 4830 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142551 4830 flags.go:64] FLAG: --pod-cidr=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142555 4830 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142565 4830 flags.go:64] FLAG: --pod-manifest-path=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142569 4830 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142573 4830 flags.go:64] FLAG: --pods-per-core="0"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142577 4830 flags.go:64] FLAG: --port="10250"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142581 4830 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142585 4830 flags.go:64] FLAG: --provider-id=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142590 4830 flags.go:64] FLAG: --qos-reserved=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142594 4830 flags.go:64] FLAG: --read-only-port="10255"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142598 4830 flags.go:64] FLAG: --register-node="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142602 4830 flags.go:64] FLAG: --register-schedulable="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142606 4830 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142617 4830 flags.go:64] FLAG: --registry-burst="10"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142621 4830 flags.go:64] FLAG: --registry-qps="5"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142625 4830 flags.go:64] FLAG: --reserved-cpus=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142629 4830 flags.go:64] FLAG: --reserved-memory=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142634 4830 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142638 4830 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142642 4830 flags.go:64] FLAG: --rotate-certificates="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142646 4830 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142650 4830 flags.go:64] FLAG: --runonce="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142654 4830 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142658 4830 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142663 4830 flags.go:64] FLAG: --seccomp-default="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142667 4830 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142671 4830 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142676 4830 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142680 4830 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142684 4830 flags.go:64] FLAG: --storage-driver-password="root"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142689 4830 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142692 4830 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142697 4830 flags.go:64] FLAG: --storage-driver-user="root"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142700 4830 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142705 4830 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142709 4830 flags.go:64] FLAG: --system-cgroups=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142713 4830 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142719 4830 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142723 4830 flags.go:64] FLAG: --tls-cert-file=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142727 4830 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142732 4830 flags.go:64] FLAG: --tls-min-version=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142736 4830 flags.go:64] FLAG: --tls-private-key-file=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142740 4830 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142744 4830 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142748 4830 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142752 4830 flags.go:64] FLAG: --v="2"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142757 4830 flags.go:64] FLAG: --version="false"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142762 4830 flags.go:64] FLAG: --vmodule=""
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142767 4830 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.142772 4830 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.144950 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.144998 4830 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145007 4830 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145016 4830 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145029 4830 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145041 4830 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145053 4830 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145063 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145072 4830 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145081 4830 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145090 4830 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145099 4830 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145108 4830 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145115 4830 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145126 4830 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145134 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145143 4830 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145151 4830 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145159 4830 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145167 4830 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145174 4830 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145182 4830 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145190 4830 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145200 4830 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145208 4830 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145215 4830 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145223 4830 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145231 4830 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145241 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145249 4830 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145256 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145265 4830 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145273 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145281 4830 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145289 4830 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145297 4830 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145305 4830 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145313 4830 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145351 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145359 4830 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145367 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145374 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145382 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145390 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145398 4830 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145405 4830 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145416 4830 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145426 4830 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145435 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145444 4830 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145454 4830 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145462 4830 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145471 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145479 4830 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145488 4830 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145497 4830 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145505 4830 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145548 4830 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145560 4830 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145570 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145593 4830 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145601 4830 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145609 4830 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145617 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145624 4830 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145633 4830 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145641 4830 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145649 4830 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145658 4830 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145665 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.145673 4830 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.145687 4830 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.156582 4830 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.156604 4830 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156660 4830 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156665 4830 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156669 4830 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156674 4830 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156678 4830 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156683 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156686 4830 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156690 4830 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156694 4830 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156698 4830 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156701 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156706 4830 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156711 4830 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156716 4830 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156719 4830 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156723 4830 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156726 4830 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156730 4830 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156734 4830 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156738 4830 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156742 4830 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156745 4830 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156749 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156752 4830 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203
22:05:11.156756 4830 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156760 4830 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156763 4830 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156767 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156770 4830 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156774 4830 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156777 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156782 4830 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156786 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156790 4830 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156794 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156797 4830 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156801 4830 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156804 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156807 4830 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156811 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156815 4830 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156818 4830 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156822 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156826 4830 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156829 4830 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156833 4830 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 
22:05:11.156837 4830 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156840 4830 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156844 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156847 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156851 4830 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156854 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156859 4830 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156864 4830 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156867 4830 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156871 4830 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156875 4830 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156878 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156881 4830 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156885 4830 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156890 4830 feature_gate.go:330] 
unrecognized feature gate: SigstoreImageVerification Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156893 4830 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156897 4830 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156900 4830 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156904 4830 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156907 4830 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156911 4830 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156915 4830 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156920 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156923 4830 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.156927 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.156933 4830 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157046 4830 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157051 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157055 4830 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157059 4830 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157063 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157067 4830 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157070 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157074 4830 feature_gate.go:330] 
unrecognized feature gate: SignatureStores Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157077 4830 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157081 4830 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157084 4830 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157088 4830 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157091 4830 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157095 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157100 4830 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157104 4830 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157107 4830 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157111 4830 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157114 4830 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157118 4830 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157121 4830 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157125 4830 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 
22:05:11.157128 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157132 4830 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157137 4830 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157140 4830 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157144 4830 feature_gate.go:330] unrecognized feature gate: Example Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157147 4830 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157151 4830 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157155 4830 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157158 4830 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157162 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157166 4830 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157170 4830 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157175 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157179 4830 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157183 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157187 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157190 4830 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157194 4830 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157198 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157202 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157205 4830 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157209 4830 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157213 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157216 4830 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157220 4830 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157223 4830 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 
22:05:11.157227 4830 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157230 4830 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157234 4830 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157237 4830 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157241 4830 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157245 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157249 4830 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157252 4830 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157257 4830 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157262 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157265 4830 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157269 4830 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157274 4830 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157279 4830 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157283 4830 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157287 4830 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157292 4830 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157295 4830 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157300 4830 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157304 4830 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157307 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157311 4830 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.157315 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.157322 4830 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.157487 4830 server.go:940] "Client rotation is on, 
will bootstrap in background" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.160319 4830 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.160401 4830 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.161007 4830 server.go:997] "Starting client certificate rotation" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.161032 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.161232 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-26 04:50:01.290104441 +0000 UTC Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.161343 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 534h44m50.128766465s for next certificate rotation Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.167220 4830 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.169159 4830 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.177329 4830 log.go:25] "Validated CRI v1 runtime API" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.204022 4830 log.go:25] "Validated CRI v1 image API" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.206164 4830 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.210438 4830 fs.go:133] Filesystem UUIDs: 
map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-22-00-46-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.210498 4830 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.244195 4830 manager.go:217] Machine: {Timestamp:2025-12-03 22:05:11.241731831 +0000 UTC m=+0.238193270 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:650ea5bb-184d-4066-8107-1bf795365c7c BootID:5096e846-2f08-4706-b180-cb04a3bb9612 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 
HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:98:3a:16 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:98:3a:16 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e6:67:e7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:95:45:1e Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ef:7a:ff Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:92:b7:31 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:c6:3b:c5:0b:d0:0c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9a:df:a2:da:7b:18 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} 
{Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.245231 4830 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. 
Perf event counters are not available. Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.245658 4830 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.246244 4830 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.246618 4830 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.246697 4830 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"G
racePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.247107 4830 topology_manager.go:138] "Creating topology manager with none policy" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.247130 4830 container_manager_linux.go:303] "Creating device plugin manager" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.247571 4830 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.247650 4830 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.248168 4830 state_mem.go:36] "Initialized new in-memory state store" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.248874 4830 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.249915 4830 kubelet.go:418] "Attempting to sync node with API server" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.249964 4830 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.250015 4830 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.250041 4830 kubelet.go:324] "Adding apiserver pod source" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.250065 4830 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 03 22:05:11 crc 
kubenswrapper[4830]: I1203 22:05:11.252415 4830 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.253034 4830 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.253168 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.253251 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.217:6443: connect: connection refused" logger="UnhandledError" Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.253410 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.253648 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.217:6443: connect: connection refused" logger="UnhandledError" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.254297 4830 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static 
kubelet mode" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255059 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255103 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255118 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255132 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255154 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255169 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255183 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255206 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255222 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255237 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255257 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255271 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.255573 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.256284 4830 server.go:1280] "Started kubelet" Dec 03 22:05:11 crc 
kubenswrapper[4830]: I1203 22:05:11.256573 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.256747 4830 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.256952 4830 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.258349 4830 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 03 22:05:11 crc systemd[1]: Started Kubernetes Kubelet. Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.258912 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.217:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187dd3ce2035e4f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 22:05:11.256229111 +0000 UTC m=+0.252690490,LastTimestamp:2025-12-03 22:05:11.256229111 +0000 UTC m=+0.252690490,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.260603 4830 server.go:460] "Adding debug handlers to kubelet server" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.260681 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 03 22:05:11 crc 
kubenswrapper[4830]: I1203 22:05:11.260726 4830 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.260860 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 19:21:49.70935169 +0000 UTC Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.260932 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 837h16m38.448426044s for next certificate rotation Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.261046 4830 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.261070 4830 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.262370 4830 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.263669 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.263825 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.217:6443: connect: connection refused" logger="UnhandledError" Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.263979 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.268399 4830 factory.go:55] Registering systemd factory Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 
22:05:11.268451 4830 factory.go:221] Registration of the systemd container factory successfully Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.270484 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" interval="200ms" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.274647 4830 factory.go:153] Registering CRI-O factory Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.274698 4830 factory.go:221] Registration of the crio container factory successfully Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.274818 4830 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.274851 4830 factory.go:103] Registering Raw factory Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.274881 4830 manager.go:1196] Started watching for new ooms in manager Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.276244 4830 manager.go:319] Starting recovery of all containers Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280387 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280485 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 03 22:05:11 crc 
kubenswrapper[4830]: I1203 22:05:11.280628 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280664 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280693 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280718 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280743 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280775 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280804 4830 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280828 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280853 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280879 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280905 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280934 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280959 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.280984 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281009 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281033 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281060 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281084 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281108 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281205 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281241 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281271 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281301 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281331 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281363 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281392 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281420 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281449 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281473 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281564 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281630 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" 
seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281673 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281700 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281726 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281752 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281778 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281804 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 03 22:05:11 crc 
kubenswrapper[4830]: I1203 22:05:11.281831 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281861 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281888 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281915 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281942 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281969 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.281994 4830 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282023 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282048 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282074 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282101 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282126 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282151 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282200 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282270 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282304 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282335 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282362 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282388 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282412 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282436 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282461 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282486 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282541 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282568 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282592 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282617 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282641 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282666 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282690 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282714 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 
03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282737 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282763 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282789 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282813 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282839 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282866 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.282894 4830 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284087 4830 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284143 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284171 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284194 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284221 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 03 22:05:11 crc 
kubenswrapper[4830]: I1203 22:05:11.284245 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284268 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284293 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284317 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284343 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284368 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284395 4830 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284419 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284447 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284472 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284498 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284566 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284597 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284621 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284646 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284672 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284697 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284721 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284748 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284774 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284848 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284877 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284903 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284942 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284970 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" 
seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.284997 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285024 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285051 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285077 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285103 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285132 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285156 4830 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285184 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285211 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285236 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285262 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285289 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285316 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285344 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285370 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285401 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285426 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285453 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285478 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285534 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285568 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285596 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285654 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285684 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285727 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285752 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285778 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285806 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285838 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285866 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285891 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" 
seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285920 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285947 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.285975 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286003 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286030 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286055 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 03 22:05:11 crc 
kubenswrapper[4830]: I1203 22:05:11.286079 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286103 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286127 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286154 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286181 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286205 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286232 4830 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286260 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286289 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286317 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286344 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286369 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286397 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286425 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286449 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286474 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286501 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286570 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286611 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286641 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286717 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286748 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286776 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286803 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286828 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286856 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286883 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286908 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286932 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286956 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.286980 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287019 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287043 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287067 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287089 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287114 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287138 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" 
seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287163 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287190 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287216 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287242 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287268 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287292 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287315 4830 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287339 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287367 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287393 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287416 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287440 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287468 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287497 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287563 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287591 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287615 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287640 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287668 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287693 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287718 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287741 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287768 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287795 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287820 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287845 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287871 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287896 4830 reconstruct.go:97] "Volume reconstruction finished" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.287912 4830 reconciler.go:26] "Reconciler: start to sync state" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.316319 4830 manager.go:324] Recovery completed Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.327790 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.330861 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.330917 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.330939 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.332305 4830 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.332348 4830 cpu_manager.go:226] 
"Reconciling" reconcilePeriod="10s" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.332381 4830 state_mem.go:36] "Initialized new in-memory state store" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.332921 4830 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.335605 4830 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.335662 4830 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.335699 4830 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.335763 4830 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.337273 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.337368 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.217:6443: connect: connection refused" logger="UnhandledError" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.341224 4830 policy_none.go:49] "None policy: Start" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.342181 4830 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.342222 4830 state_mem.go:35] 
"Initializing new in-memory state store" Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.364320 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.408006 4830 manager.go:334] "Starting Device Plugin manager" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.408072 4830 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.408084 4830 server.go:79] "Starting device plugin registration server" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.408785 4830 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.408807 4830 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.408989 4830 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.409147 4830 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.409165 4830 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.421752 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.436811 4830 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 22:05:11 crc kubenswrapper[4830]: 
I1203 22:05:11.436936 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.438260 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.438303 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.438313 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.438440 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.438688 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.438754 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.439316 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.439348 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.439361 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.439548 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.439673 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.439711 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.439898 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.439937 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.439958 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.440307 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.440335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.440343 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.440350 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.440373 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.440388 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.440423 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.440620 
4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.440664 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.441068 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.441113 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.441130 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.441310 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.441377 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.441415 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.441419 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.441436 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.441459 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.442256 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.442298 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.442315 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.442360 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.442386 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.442411 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.442565 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.442605 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.443607 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.443669 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.443683 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.471599 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" interval="400ms" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.490714 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.490759 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.490783 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.490809 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.490861 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.490900 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.490931 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.490969 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.490992 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.491014 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.491039 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.491128 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.491178 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.491214 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.491253 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.509015 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.510473 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.510533 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.510548 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.510576 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.511076 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": 
dial tcp 38.102.83.217:6443: connect: connection refused" node="crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592448 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592561 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592601 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592631 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592661 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592699 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592720 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592784 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592803 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592818 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592868 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592889 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592731 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592941 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592940 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.592976 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593037 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593076 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593161 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593105 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593247 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593306 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593327 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593354 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593410 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593418 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593431 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593456 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593467 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.593607 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.711652 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.713213 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.713273 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.713291 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.713330 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.713963 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.217:6443: connect: connection refused" node="crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.765215 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.773826 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.796199 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.804585 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6162a29bd061e0678916f1cadf210e839a2d4dcc61c01e9b27bac417b5610fae WatchSource:0}: Error finding container 6162a29bd061e0678916f1cadf210e839a2d4dcc61c01e9b27bac417b5610fae: Status 404 returned error can't find the container with id 6162a29bd061e0678916f1cadf210e839a2d4dcc61c01e9b27bac417b5610fae Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.806916 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.807568 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-124299498a4221909e323ac304327d32b3ad5abf7f333f5206d29f9b137c5bd1 WatchSource:0}: Error finding container 124299498a4221909e323ac304327d32b3ad5abf7f333f5206d29f9b137c5bd1: Status 404 returned error can't find the container with id 124299498a4221909e323ac304327d32b3ad5abf7f333f5206d29f9b137c5bd1 Dec 03 22:05:11 crc kubenswrapper[4830]: I1203 22:05:11.812218 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.828330 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-3259fa339aa24944ef1c02026a4133a73238325e20a204b010f221e8518a7b97 WatchSource:0}: Error finding container 3259fa339aa24944ef1c02026a4133a73238325e20a204b010f221e8518a7b97: Status 404 returned error can't find the container with id 3259fa339aa24944ef1c02026a4133a73238325e20a204b010f221e8518a7b97 Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.831378 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-5840ac8a877a734189cb328fd9ec01fcdc3b13e2307ed26269f628ba1f7e1246 WatchSource:0}: Error finding container 5840ac8a877a734189cb328fd9ec01fcdc3b13e2307ed26269f628ba1f7e1246: Status 404 returned error can't find the container with id 5840ac8a877a734189cb328fd9ec01fcdc3b13e2307ed26269f628ba1f7e1246 Dec 03 22:05:11 crc kubenswrapper[4830]: W1203 22:05:11.835715 4830 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-3508cb3d138826d6f33500e7a2e444a6c70bdb6f402dac92cb7197b7e16df75e WatchSource:0}: Error finding container 3508cb3d138826d6f33500e7a2e444a6c70bdb6f402dac92cb7197b7e16df75e: Status 404 returned error can't find the container with id 3508cb3d138826d6f33500e7a2e444a6c70bdb6f402dac92cb7197b7e16df75e Dec 03 22:05:11 crc kubenswrapper[4830]: E1203 22:05:11.873241 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" interval="800ms" Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.114385 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.115995 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.116046 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.116063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.116099 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 22:05:12 crc kubenswrapper[4830]: E1203 22:05:12.116523 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.217:6443: connect: connection refused" node="crc" Dec 03 22:05:12 crc kubenswrapper[4830]: W1203 22:05:12.161279 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:12 crc kubenswrapper[4830]: E1203 22:05:12.161355 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.217:6443: connect: connection refused" logger="UnhandledError" Dec 03 22:05:12 crc kubenswrapper[4830]: W1203 22:05:12.192980 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:12 crc kubenswrapper[4830]: E1203 22:05:12.193047 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.217:6443: connect: connection refused" logger="UnhandledError" Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.257155 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.343498 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735"} Dec 03 22:05:12 crc 
kubenswrapper[4830]: I1203 22:05:12.343642 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5840ac8a877a734189cb328fd9ec01fcdc3b13e2307ed26269f628ba1f7e1246"} Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.345814 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7"} Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.345907 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3259fa339aa24944ef1c02026a4133a73238325e20a204b010f221e8518a7b97"} Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.348539 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e"} Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.348584 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6162a29bd061e0678916f1cadf210e839a2d4dcc61c01e9b27bac417b5610fae"} Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.350958 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b"} Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.351026 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"124299498a4221909e323ac304327d32b3ad5abf7f333f5206d29f9b137c5bd1"} Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.353228 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb"} Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.353280 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3508cb3d138826d6f33500e7a2e444a6c70bdb6f402dac92cb7197b7e16df75e"} Dec 03 22:05:12 crc kubenswrapper[4830]: W1203 22:05:12.593571 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:12 crc kubenswrapper[4830]: E1203 22:05:12.593700 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.217:6443: connect: connection refused" logger="UnhandledError" Dec 03 22:05:12 crc kubenswrapper[4830]: W1203 22:05:12.632047 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:12 crc kubenswrapper[4830]: E1203 22:05:12.632142 4830 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.217:6443: connect: connection refused" logger="UnhandledError" Dec 03 22:05:12 crc kubenswrapper[4830]: E1203 22:05:12.674348 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" interval="1.6s" Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.917576 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.919988 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.920049 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.920065 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:12 crc kubenswrapper[4830]: I1203 22:05:12.920099 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 22:05:12 crc kubenswrapper[4830]: E1203 22:05:12.920620 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.217:6443: connect: connection refused" node="crc" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.258194 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.217:6443: connect: connection refused Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.358209 4830 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb" exitCode=0 Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.358390 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.358386 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb"} Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.359717 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.359754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.359768 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.362780 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf"} Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.362830 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.362838 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b"} Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.362965 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e"} Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.364431 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.364476 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.364492 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.366442 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7" exitCode=0 Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.366571 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7"} Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.366658 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.368016 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.368085 4830 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.368107 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.368806 4830 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e" exitCode=0 Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.368941 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e"} Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.369004 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.370153 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.370195 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.370215 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.372647 4830 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b" exitCode=0 Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.372721 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b"} Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.372761 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.374051 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.374093 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.374110 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.374673 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.376339 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.376370 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:13 crc kubenswrapper[4830]: I1203 22:05:13.376384 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.257340 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:14 crc kubenswrapper[4830]: E1203 22:05:14.275365 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" interval="3.2s" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.377911 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909"} Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.380262 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170"} Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.382529 4830 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c" exitCode=0 Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.382587 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c"} Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.382744 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.384086 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.384124 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.384143 4830 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.385611 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"77eccb2ab451dfe1e6f51e4063cccb3eeca8b21e734e3a705c487216476cfb8f"} Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.385636 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.385642 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.387044 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.387074 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.387083 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.387335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.387396 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.387420 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.521664 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.523577 4830 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.523646 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.523672 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.523744 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 22:05:14 crc kubenswrapper[4830]: E1203 22:05:14.524473 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.217:6443: connect: connection refused" node="crc" Dec 03 22:05:14 crc kubenswrapper[4830]: I1203 22:05:14.532442 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:14 crc kubenswrapper[4830]: W1203 22:05:14.762343 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:14 crc kubenswrapper[4830]: E1203 22:05:14.762469 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.217:6443: connect: connection refused" logger="UnhandledError" Dec 03 22:05:14 crc kubenswrapper[4830]: W1203 22:05:14.863743 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:14 crc kubenswrapper[4830]: E1203 22:05:14.863834 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.217:6443: connect: connection refused" logger="UnhandledError" Dec 03 22:05:14 crc kubenswrapper[4830]: W1203 22:05:14.889667 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.217:6443: connect: connection refused Dec 03 22:05:14 crc kubenswrapper[4830]: E1203 22:05:14.889745 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.217:6443: connect: connection refused" logger="UnhandledError" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.392169 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275"} Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.392211 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27"} Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.396958 4830 generic.go:334] 
"Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe" exitCode=0 Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.397028 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.397030 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe"} Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.397839 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.397869 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.397883 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.399646 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c"} Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.399732 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2"} Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.399828 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:15 crc kubenswrapper[4830]: 
I1203 22:05:15.399840 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.399864 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.402267 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.402358 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.402458 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.402471 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.402431 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.402606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.403592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.403903 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.404019 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:15 crc kubenswrapper[4830]: I1203 22:05:15.477068 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.408295 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6"} Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.408351 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d"} Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.408383 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.409695 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.409763 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.409788 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.415448 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a"} Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.415548 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e"} Dec 03 22:05:16 crc 
kubenswrapper[4830]: I1203 22:05:16.415584 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6"} Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.415611 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.415583 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.415750 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.417401 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.417448 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.417465 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.418751 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.418807 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:16 crc kubenswrapper[4830]: I1203 22:05:16.418827 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.426314 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17"} Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.426411 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.426411 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235"} Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.426347 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.426645 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.428319 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.428377 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.428403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.428494 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.428581 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.428605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 
22:05:17.533325 4830 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.533471 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.590948 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.725385 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.727143 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.727206 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.727230 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.727280 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.752378 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:17 crc 
kubenswrapper[4830]: I1203 22:05:17.752664 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.754931 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.755121 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.755305 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.763451 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:17 crc kubenswrapper[4830]: I1203 22:05:17.843986 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.430610 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.430670 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.430759 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.431743 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.432403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.432469 4830 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.432487 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.432611 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.432654 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.432673 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.433225 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.433299 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:18 crc kubenswrapper[4830]: I1203 22:05:18.433337 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.058619 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.433707 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.433743 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.435760 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.435807 
4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.435827 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.435774 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.435966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.435992 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.443941 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.444078 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.445055 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.445116 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.445134 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.634930 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.635636 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 
03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.637640 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.637705 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:19 crc kubenswrapper[4830]: I1203 22:05:19.637724 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:20 crc kubenswrapper[4830]: I1203 22:05:20.342198 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:20 crc kubenswrapper[4830]: I1203 22:05:20.437317 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:20 crc kubenswrapper[4830]: I1203 22:05:20.438821 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:20 crc kubenswrapper[4830]: I1203 22:05:20.439003 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:20 crc kubenswrapper[4830]: I1203 22:05:20.439140 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:21 crc kubenswrapper[4830]: E1203 22:05:21.421876 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 22:05:24 crc kubenswrapper[4830]: I1203 22:05:24.200965 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 03 22:05:24 crc kubenswrapper[4830]: I1203 22:05:24.201207 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:24 crc kubenswrapper[4830]: I1203 22:05:24.202540 4830 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:24 crc kubenswrapper[4830]: I1203 22:05:24.202584 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:24 crc kubenswrapper[4830]: I1203 22:05:24.202601 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:24 crc kubenswrapper[4830]: I1203 22:05:24.911984 4830 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 22:05:24 crc kubenswrapper[4830]: I1203 22:05:24.912086 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 22:05:24 crc kubenswrapper[4830]: I1203 22:05:24.921368 4830 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 22:05:24 crc kubenswrapper[4830]: I1203 22:05:24.921428 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 22:05:25 crc 
kubenswrapper[4830]: I1203 22:05:25.487427 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:25 crc kubenswrapper[4830]: I1203 22:05:25.487689 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:25 crc kubenswrapper[4830]: I1203 22:05:25.489045 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:25 crc kubenswrapper[4830]: I1203 22:05:25.489085 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:25 crc kubenswrapper[4830]: I1203 22:05:25.489102 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:27 crc kubenswrapper[4830]: I1203 22:05:27.534573 4830 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:05:27 crc kubenswrapper[4830]: I1203 22:05:27.534708 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:05:27 crc kubenswrapper[4830]: I1203 22:05:27.600761 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:27 crc kubenswrapper[4830]: I1203 22:05:27.601618 4830 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:27 crc kubenswrapper[4830]: I1203 22:05:27.603204 4830 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 22:05:27 crc kubenswrapper[4830]: I1203 22:05:27.603297 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 22:05:27 crc kubenswrapper[4830]: I1203 22:05:27.606398 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:27 crc kubenswrapper[4830]: I1203 22:05:27.606472 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:27 crc kubenswrapper[4830]: I1203 22:05:27.606495 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:27 crc kubenswrapper[4830]: I1203 22:05:27.613712 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:28 crc kubenswrapper[4830]: I1203 22:05:28.458392 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:28 crc kubenswrapper[4830]: I1203 22:05:28.458873 4830 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get 
\"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 22:05:28 crc kubenswrapper[4830]: I1203 22:05:28.458959 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 22:05:28 crc kubenswrapper[4830]: I1203 22:05:28.459945 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:28 crc kubenswrapper[4830]: I1203 22:05:28.459988 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:28 crc kubenswrapper[4830]: I1203 22:05:28.460001 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:29 crc kubenswrapper[4830]: I1203 22:05:29.060116 4830 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 22:05:29 crc kubenswrapper[4830]: I1203 22:05:29.060321 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 22:05:29 crc kubenswrapper[4830]: I1203 22:05:29.919889 4830 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 22:05:29 crc kubenswrapper[4830]: E1203 
22:05:29.930360 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Dec 03 22:05:29 crc kubenswrapper[4830]: I1203 22:05:29.931242 4830 trace.go:236] Trace[1548916857]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 22:05:14.957) (total time: 14973ms): Dec 03 22:05:29 crc kubenswrapper[4830]: Trace[1548916857]: ---"Objects listed" error: 14973ms (22:05:29.931) Dec 03 22:05:29 crc kubenswrapper[4830]: Trace[1548916857]: [14.973412611s] [14.973412611s] END Dec 03 22:05:29 crc kubenswrapper[4830]: I1203 22:05:29.931285 4830 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 22:05:29 crc kubenswrapper[4830]: I1203 22:05:29.932215 4830 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 03 22:05:29 crc kubenswrapper[4830]: I1203 22:05:29.932332 4830 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 22:05:29 crc kubenswrapper[4830]: I1203 22:05:29.933751 4830 trace.go:236] Trace[1125544812]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 22:05:19.472) (total time: 10461ms): Dec 03 22:05:29 crc kubenswrapper[4830]: Trace[1125544812]: ---"Objects listed" error: 10461ms (22:05:29.933) Dec 03 22:05:29 crc kubenswrapper[4830]: Trace[1125544812]: [10.461505134s] [10.461505134s] END Dec 03 22:05:29 crc kubenswrapper[4830]: I1203 22:05:29.933793 4830 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 22:05:29 crc kubenswrapper[4830]: E1203 22:05:29.938119 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" 
node="crc" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.261916 4830 apiserver.go:52] "Watching apiserver" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.264011 4830 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.264230 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.264610 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.264765 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.264820 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.264714 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.265014 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.265000 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.265029 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.265475 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.265656 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.266987 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.267032 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.267447 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.267484 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.267611 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.267628 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.268045 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.268192 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.269897 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.289265 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.303091 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.319114 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.333009 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.355871 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.363120 4830 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.388014 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.411459 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.434951 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435003 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435029 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435051 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435077 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435098 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435119 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435143 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435164 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435184 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435204 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435225 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435264 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435287 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435309 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435329 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435350 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435374 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435395 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435416 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435438 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435460 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435480 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435502 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435541 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435562 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435588 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435610 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435631 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 
22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435654 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435673 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435694 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435714 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435733 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435756 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435776 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435796 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435828 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435850 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435869 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") 
" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435888 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435907 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435948 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435976 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.435999 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436022 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436046 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436068 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436090 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436112 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436134 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " 
Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436183 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436210 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436233 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436255 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436275 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436293 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436309 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436324 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436340 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436357 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436372 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 
22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436427 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436444 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436461 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436477 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436492 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436523 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436552 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436568 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436582 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436597 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436612 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436628 
4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436653 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436670 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436690 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436706 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436723 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436740 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436756 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436773 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436789 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436806 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436823 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436839 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436858 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436875 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436896 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436914 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 22:05:30 crc 
kubenswrapper[4830]: I1203 22:05:30.436929 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436946 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436962 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436977 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.436989 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437022 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437040 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437009 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437058 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437077 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437095 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437113 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437131 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437148 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437165 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437182 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437199 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437217 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437232 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437249 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437266 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437284 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437301 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437321 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437337 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437352 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437368 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437384 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437401 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437417 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" 
(UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437434 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437453 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437469 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437485 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437516 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437532 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437548 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437566 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437582 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437599 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437616 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437634 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437651 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437670 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437687 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437703 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437719 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437735 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437753 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437884 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437906 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437923 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437939 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437956 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437972 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437989 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438005 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438020 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438036 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438052 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438070 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438089 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438106 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 22:05:30 crc 
kubenswrapper[4830]: I1203 22:05:30.438122 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438138 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438155 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438173 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438191 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438208 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438225 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438242 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438260 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438277 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438301 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438319 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438337 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438357 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438373 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438390 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438406 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438422 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438445 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438462 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438479 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438497 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438530 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438547 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438564 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438582 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438603 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438619 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438636 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438653 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438669 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438686 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438702 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 22:05:30 crc 
kubenswrapper[4830]: I1203 22:05:30.438720 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438738 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438755 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438771 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438788 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438813 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438830 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438846 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438865 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438883 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438901 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438941 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438996 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439019 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439039 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439059 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439077 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439098 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439116 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439134 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439152 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439169 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439187 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439203 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439220 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439250 4830 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439262 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.442580 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437039 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437162 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437173 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437229 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437249 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437300 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437408 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437443 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437432 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437542 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437567 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437603 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437620 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437701 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437734 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437725 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437797 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437826 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437974 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.437969 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.446502 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438023 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438048 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438122 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438135 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438187 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438274 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438283 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438261 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438334 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438436 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438454 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438474 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438526 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438583 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438590 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438662 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438708 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438730 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438836 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438843 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438884 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.438936 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439066 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439083 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439095 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439166 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439365 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439409 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439579 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439659 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439695 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439821 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.439863 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.440117 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.440316 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.440386 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.440451 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.440629 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.443418 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.443800 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.444100 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.444382 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.444441 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.444679 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.445063 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.445386 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.445621 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.445783 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.445927 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.446882 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.446979 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.447046 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.447048 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.447246 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.447291 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.447264 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.447470 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.447693 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.447704 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.447842 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.447851 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.448217 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.448269 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.448544 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.448806 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.449047 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.449307 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.449558 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.449709 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.449725 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.450172 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.450193 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.450264 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.450326 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.450995 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.451594 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.451800 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.451987 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.452008 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.452176 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.452194 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.452449 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.452730 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.452854 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.452910 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.451755 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.453594 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.453956 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.453965 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.454085 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.454184 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.454582 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.454618 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.456049 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.456362 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.458267 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.458916 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.459145 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.459318 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.459481 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.459854 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.460082 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.460275 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.460443 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.461618 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.461787 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.462114 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.462448 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.462647 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.463331 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.463747 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.464076 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.468439 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.468447 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.468817 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.468820 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.468840 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.469084 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.469109 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.469139 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.469231 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.469292 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.469443 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.469586 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.469648 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.470231 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.470445 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.470460 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.470639 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.470695 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.470706 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.470799 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.471273 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.471842 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.471875 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.472148 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.472359 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.472561 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.472716 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.472864 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.473610 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:05:30.973589698 +0000 UTC m=+19.970051047 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.473838 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.473888 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:30.973880106 +0000 UTC m=+19.970341455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.474109 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.474197 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-03 22:05:30.974170222 +0000 UTC m=+19.970631571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.474280 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.474286 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.474489 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.474596 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.474792 4830 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.475315 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.475554 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.476816 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.476971 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.477398 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.477811 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.478402 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.479244 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.482207 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.481997 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.482034 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.482406 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.482417 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.482451 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.482497 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.482679 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.482754 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.482765 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.482862 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.482995 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.483081 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.483296 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.492568 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.492751 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.493230 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.493255 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.493269 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.493344 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:30.993324231 +0000 UTC m=+19.989785580 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.495148 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.495249 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.495938 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.498573 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.498947 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.499904 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.504908 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.504931 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.504941 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:30 crc kubenswrapper[4830]: E1203 22:05:30.505000 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:31.004980847 +0000 UTC m=+20.001442196 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.523461 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.535197 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.542957 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.542994 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543088 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543107 4830 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543116 4830 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543124 4830 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: 
I1203 22:05:30.543135 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543143 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543152 4830 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543160 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543168 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543176 4830 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543184 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543191 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543199 4830 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543208 4830 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543216 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543225 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543233 4830 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543241 4830 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543249 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543257 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543266 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543273 4830 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543282 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543292 4830 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543299 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543307 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" 
DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543314 4830 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543338 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543346 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543353 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543362 4830 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543369 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543384 4830 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543391 4830 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543399 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543408 4830 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543416 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543429 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543437 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543445 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543452 4830 reconciler_common.go:293] "Volume 
detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543460 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543468 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543475 4830 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543486 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543495 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543518 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543528 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543536 4830 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543547 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543556 4830 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543564 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543576 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543583 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543591 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543598 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543606 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543617 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543624 4830 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543633 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543641 4830 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543649 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath 
\"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543657 4830 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543665 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543672 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543680 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543687 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543695 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543702 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543710 4830 reconciler_common.go:293] "Volume detached for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543717 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543731 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543739 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543751 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543758 4830 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543766 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543774 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543782 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543791 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543802 4830 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543810 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543821 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543830 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543838 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" 
DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543846 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543854 4830 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543861 4830 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543869 4830 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543879 4830 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543886 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543894 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc 
kubenswrapper[4830]: I1203 22:05:30.543902 4830 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543909 4830 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543917 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543925 4830 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543936 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543945 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543953 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543963 4830 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543971 4830 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543978 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543985 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.543994 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544001 4830 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544010 4830 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544017 4830 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544025 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544033 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544043 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544057 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544065 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544077 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544085 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544093 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544100 4830 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544108 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544116 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544123 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544131 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544138 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 
03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544150 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544157 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544170 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544178 4830 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544185 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544193 4830 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544204 4830 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc 
kubenswrapper[4830]: I1203 22:05:30.544212 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544220 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544228 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544236 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544256 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544265 4830 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544272 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544280 4830 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544288 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544296 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544303 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544311 4830 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544318 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544329 4830 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544336 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544344 4830 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544351 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544362 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544379 4830 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544386 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544394 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544401 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 
03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544408 4830 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544416 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544424 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544432 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544439 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544447 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544455 4830 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544463 4830 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544470 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544478 4830 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544486 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544494 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544519 4830 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544527 4830 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544534 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544546 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544554 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544577 4830 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544593 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544600 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544608 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544624 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544632 4830 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544639 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544647 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544655 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544663 4830 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544671 4830 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544679 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544695 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544702 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544720 4830 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544742 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544750 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544758 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544766 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544774 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544782 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544797 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544805 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544815 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544823 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544865 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.544981 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.545326 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.546135 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.580932 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.589959 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.607046 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.645093 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.913174 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pr4cr"] Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.913453 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pr4cr" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.915036 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.916783 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.917220 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.924283 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.934985 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.945591 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.962012 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 22:05:30 crc kubenswrapper[4830]: I1203 22:05:30.979112 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.003066 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.021764 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.048002 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.048079 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.048098 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.048116 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.048138 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2b0660f-4a2e-4d96-829a-fc54cbf92f0b-hosts-file\") pod \"node-resolver-pr4cr\" (UID: \"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\") " pod="openshift-dns/node-resolver-pr4cr" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.048153 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qznjq\" (UniqueName: \"kubernetes.io/projected/b2b0660f-4a2e-4d96-829a-fc54cbf92f0b-kube-api-access-qznjq\") pod \"node-resolver-pr4cr\" (UID: \"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\") " pod="openshift-dns/node-resolver-pr4cr" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.048172 
4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048238 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048290 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:05:32.048244509 +0000 UTC m=+21.044705858 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048344 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:32.048330911 +0000 UTC m=+21.044792260 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048293 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048376 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048404 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048428 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048475 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048485 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-03 22:05:32.048478094 +0000 UTC m=+21.044939443 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048411 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048522 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048562 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:32.048532975 +0000 UTC m=+21.044994494 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:31 crc kubenswrapper[4830]: E1203 22:05:31.048592 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:32.048579537 +0000 UTC m=+21.045041116 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.149467 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2b0660f-4a2e-4d96-829a-fc54cbf92f0b-hosts-file\") pod \"node-resolver-pr4cr\" (UID: \"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\") " pod="openshift-dns/node-resolver-pr4cr" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.149544 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qznjq\" (UniqueName: \"kubernetes.io/projected/b2b0660f-4a2e-4d96-829a-fc54cbf92f0b-kube-api-access-qznjq\") pod \"node-resolver-pr4cr\" (UID: \"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\") " pod="openshift-dns/node-resolver-pr4cr" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.149737 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2b0660f-4a2e-4d96-829a-fc54cbf92f0b-hosts-file\") pod \"node-resolver-pr4cr\" (UID: \"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\") " pod="openshift-dns/node-resolver-pr4cr" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.167117 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qznjq\" (UniqueName: \"kubernetes.io/projected/b2b0660f-4a2e-4d96-829a-fc54cbf92f0b-kube-api-access-qznjq\") pod \"node-resolver-pr4cr\" (UID: \"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\") " pod="openshift-dns/node-resolver-pr4cr" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.224790 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pr4cr" Dec 03 22:05:31 crc kubenswrapper[4830]: W1203 22:05:31.240584 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2b0660f_4a2e_4d96_829a_fc54cbf92f0b.slice/crio-ed59fab62334405c245ec497eca2ca667724f8d4734e742f6d43efb03becc483 WatchSource:0}: Error finding container ed59fab62334405c245ec497eca2ca667724f8d4734e742f6d43efb03becc483: Status 404 returned error can't find the container with id ed59fab62334405c245ec497eca2ca667724f8d4734e742f6d43efb03becc483 Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.299681 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-nfl7k"] Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.301182 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-sh485"] Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.301452 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.302003 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.305297 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5vgkl"] Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.306555 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.306649 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.306695 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.307401 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.321305 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.321485 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.322547 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.322747 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.323012 4830 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.323282 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.323588 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.324178 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.324467 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.324730 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.324906 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.325095 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.325257 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.327707 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.331692 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-wdcn6"] Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.333804 4830 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.335143 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.336190 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.341715 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.342443 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.344129 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.345039 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.346681 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.347473 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 22:05:31 crc 
kubenswrapper[4830]: I1203 22:05:31.348256 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.349848 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.350694 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.352105 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.352425 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.352845 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.354637 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.355300 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.356024 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.357302 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.358081 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.359362 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.359900 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.360752 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.362058 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.362473 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.363380 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.363809 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.364809 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.365212 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.365792 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.366794 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.367249 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.368159 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.368625 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.369484 4830 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.369600 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.371196 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.372023 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.372459 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.373980 4830 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.374591 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.375430 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.375928 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.376053 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.377031 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.377632 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.378804 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.379373 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.380409 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.380869 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.381820 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.382288 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.383348 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" 
path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.383849 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.384628 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.385079 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.386019 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.386563 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.387009 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.398250 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.430739 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.454968 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-mcd-auth-proxy-config\") pod \"machine-config-daemon-nfl7k\" (UID: \"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.454999 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-config\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455017 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-run-netns\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455031 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-var-lib-cni-bin\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455047 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/080247dd-b7ea-44e0-9145-da0eeade0107-cni-binary-copy\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455062 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-node-log\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455085 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-var-lib-cni-multus\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455099 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-var-lib-kubelet\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455115 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-netns\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455128 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-systemd\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455150 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-kubelet\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455163 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-openvswitch\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455179 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-system-cni-dir\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455192 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-run-k8s-cni-cncf-io\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455206 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44a18320-6162-4fc5-a89c-363c4c6cd030-ovn-node-metrics-cert\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455222 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-cnibin\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455238 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/080247dd-b7ea-44e0-9145-da0eeade0107-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455255 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-rootfs\") pod \"machine-config-daemon-nfl7k\" (UID: \"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455271 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455293 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-os-release\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455307 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-os-release\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455321 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-multus-socket-dir-parent\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" 
Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455335 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4kbl\" (UniqueName: \"kubernetes.io/projected/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-kube-api-access-c4kbl\") pod \"machine-config-daemon-nfl7k\" (UID: \"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455348 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-env-overrides\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455365 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455381 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-netd\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455394 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-log-socket\") pod \"ovnkube-node-5vgkl\" (UID: 
\"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455408 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtl7l\" (UniqueName: \"kubernetes.io/projected/080247dd-b7ea-44e0-9145-da0eeade0107-kube-api-access-wtl7l\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455423 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-systemd-units\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455436 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-hostroot\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455449 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-ovn-kubernetes\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455463 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-multus-cni-dir\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455477 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bdccedf8-f580-49f0-848e-108c748d8a21-multus-daemon-config\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455493 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-run-multus-certs\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455521 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-proxy-tls\") pod \"machine-config-daemon-nfl7k\" (UID: \"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455535 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-etc-openvswitch\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455551 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-ovn\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455564 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-script-lib\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455576 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-multus-conf-dir\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455590 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-etc-kubernetes\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455605 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blsqf\" (UniqueName: \"kubernetes.io/projected/bdccedf8-f580-49f0-848e-108c748d8a21-kube-api-access-blsqf\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455620 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-bin\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455632 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-slash\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455645 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sktrh\" (UniqueName: \"kubernetes.io/projected/44a18320-6162-4fc5-a89c-363c4c6cd030-kube-api-access-sktrh\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455658 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-system-cni-dir\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455685 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bdccedf8-f580-49f0-848e-108c748d8a21-cni-binary-copy\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455699 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-var-lib-openvswitch\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.455720 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-cnibin\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.476611 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.484811 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e"} Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.484854 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f"} Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.484863 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7fe4efcb3065d8345fc62a24595c8dedf53fcf7402caba1a9016fa298fa1dcc4"} Dec 03 22:05:31 crc 
kubenswrapper[4830]: I1203 22:05:31.487242 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.489643 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d" exitCode=255 Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.489682 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d"} Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.491098 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pr4cr" event={"ID":"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b","Type":"ContainerStarted","Data":"ed59fab62334405c245ec497eca2ca667724f8d4734e742f6d43efb03becc483"} Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.493103 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"eabe3ad641c4fa86ae3a5974c221689306c1343cf43c36947324bf7f9d224d4e"} Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.496799 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9"} Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.496837 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f18be4a726453ef9701b869512fd125a3952045c091725e9e77635d471ed29c3"} Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.500193 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.518878 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.528257 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.528426 4830 scope.go:117] "RemoveContainer" containerID="90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.540536 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557036 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-run-k8s-cni-cncf-io\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557084 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/44a18320-6162-4fc5-a89c-363c4c6cd030-ovn-node-metrics-cert\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557108 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-cnibin\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557129 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/080247dd-b7ea-44e0-9145-da0eeade0107-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557147 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-rootfs\") pod \"machine-config-daemon-nfl7k\" (UID: \"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557166 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557192 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-os-release\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557236 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-os-release\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557251 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-multus-socket-dir-parent\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557267 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4kbl\" (UniqueName: \"kubernetes.io/projected/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-kube-api-access-c4kbl\") pod \"machine-config-daemon-nfl7k\" (UID: \"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557284 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-netd\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557300 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-env-overrides\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557316 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557332 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-log-socket\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557351 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtl7l\" (UniqueName: \"kubernetes.io/projected/080247dd-b7ea-44e0-9145-da0eeade0107-kube-api-access-wtl7l\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557366 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-hostroot\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557383 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-systemd-units\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557401 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-ovn-kubernetes\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557416 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-multus-cni-dir\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557433 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bdccedf8-f580-49f0-848e-108c748d8a21-multus-daemon-config\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557448 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-run-multus-certs\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557462 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-proxy-tls\") pod 
\"machine-config-daemon-nfl7k\" (UID: \"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.557616 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-run-k8s-cni-cncf-io\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558022 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03
T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558164 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-cnibin\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558239 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-etc-openvswitch\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558306 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-ovn\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" 
Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558325 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-bin\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558342 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-script-lib\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558359 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-multus-conf-dir\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558377 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-etc-kubernetes\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558399 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blsqf\" (UniqueName: \"kubernetes.io/projected/bdccedf8-f580-49f0-848e-108c748d8a21-kube-api-access-blsqf\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558436 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-slash\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558450 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sktrh\" (UniqueName: \"kubernetes.io/projected/44a18320-6162-4fc5-a89c-363c4c6cd030-kube-api-access-sktrh\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558467 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-system-cni-dir\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558496 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-cnibin\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558564 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bdccedf8-f580-49f0-848e-108c748d8a21-cni-binary-copy\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558586 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-var-lib-openvswitch\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558605 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/080247dd-b7ea-44e0-9145-da0eeade0107-cni-binary-copy\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558625 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-mcd-auth-proxy-config\") pod \"machine-config-daemon-nfl7k\" (UID: \"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558642 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-config\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558660 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-run-netns\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558676 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-var-lib-cni-bin\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558692 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-systemd\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558869 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-system-cni-dir\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558924 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/080247dd-b7ea-44e0-9145-da0eeade0107-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558949 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-ovn\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558920 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-etc-openvswitch\") pod \"ovnkube-node-5vgkl\" (UID: 
\"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558983 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-bin\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.558984 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-log-socket\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.559411 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-hostroot\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.559465 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-systemd-units\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.559494 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-ovn-kubernetes\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 
22:05:31.559556 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-multus-cni-dir\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.559634 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-script-lib\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.559677 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-multus-conf-dir\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.559673 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-mcd-auth-proxy-config\") pod \"machine-config-daemon-nfl7k\" (UID: \"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.559699 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-etc-kubernetes\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.559750 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-cnibin\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.565099 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bdccedf8-f580-49f0-848e-108c748d8a21-multus-daemon-config\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.565610 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bdccedf8-f580-49f0-848e-108c748d8a21-cni-binary-copy\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.565649 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-node-log\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.565720 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.566245 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-proxy-tls\") pod \"machine-config-daemon-nfl7k\" (UID: 
\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.566683 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44a18320-6162-4fc5-a89c-363c4c6cd030-ovn-node-metrics-cert\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.569782 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-var-lib-openvswitch\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.570095 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-var-lib-cni-multus\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.570544 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-var-lib-cni-multus\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.570591 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-run-multus-certs\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " 
pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.570619 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-run-netns\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.570923 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/080247dd-b7ea-44e0-9145-da0eeade0107-cni-binary-copy\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.570943 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-env-overrides\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571236 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-os-release\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571272 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-multus-socket-dir-parent\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 
22:05:31.571282 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-config\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571283 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-os-release\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571325 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-var-lib-kubelet\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571368 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-netns\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571400 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-system-cni-dir\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571453 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-kubelet\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571523 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-openvswitch\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571605 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-netns\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571633 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-netd\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571661 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-var-lib-cni-bin\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571663 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bdccedf8-f580-49f0-848e-108c748d8a21-host-var-lib-kubelet\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571677 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/080247dd-b7ea-44e0-9145-da0eeade0107-system-cni-dir\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571680 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-systemd\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571698 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-node-log\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571707 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-openvswitch\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571720 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-kubelet\") pod \"ovnkube-node-5vgkl\" (UID: 
\"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571726 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571733 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-slash\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.571747 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-rootfs\") pod \"machine-config-daemon-nfl7k\" (UID: \"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.588886 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.594181 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sktrh\" (UniqueName: \"kubernetes.io/projected/44a18320-6162-4fc5-a89c-363c4c6cd030-kube-api-access-sktrh\") pod \"ovnkube-node-5vgkl\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.598669 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4kbl\" (UniqueName: \"kubernetes.io/projected/d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad-kube-api-access-c4kbl\") pod \"machine-config-daemon-nfl7k\" (UID: \"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\") " pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.599162 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wtl7l\" (UniqueName: \"kubernetes.io/projected/080247dd-b7ea-44e0-9145-da0eeade0107-kube-api-access-wtl7l\") pod \"multus-additional-cni-plugins-wdcn6\" (UID: \"080247dd-b7ea-44e0-9145-da0eeade0107\") " pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.599987 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blsqf\" (UniqueName: \"kubernetes.io/projected/bdccedf8-f580-49f0-848e-108c748d8a21-kube-api-access-blsqf\") pod \"multus-sh485\" (UID: \"bdccedf8-f580-49f0-848e-108c748d8a21\") " pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.614062 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.628192 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.654641 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.668253 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-sh485" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.680095 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.680482 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:05:31 crc kubenswrapper[4830]: W1203 22:05:31.683248 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdccedf8_f580_49f0_848e_108c748d8a21.slice/crio-d8da05419d7ee8f5e598012850f3a79e87564b8a315a5b87cab4c34b2711449c WatchSource:0}: Error finding container d8da05419d7ee8f5e598012850f3a79e87564b8a315a5b87cab4c34b2711449c: Status 404 returned error can't find the container with id d8da05419d7ee8f5e598012850f3a79e87564b8a315a5b87cab4c34b2711449c Dec 03 22:05:31 crc kubenswrapper[4830]: W1203 22:05:31.694104 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0d9f02c_88ac_490a_8ec3_2f3fbb0ae3ad.slice/crio-ef7dc1e19d2b76242f9c6b79ea50c4e47573acec03055cc5deeca2900f8765de WatchSource:0}: Error finding container ef7dc1e19d2b76242f9c6b79ea50c4e47573acec03055cc5deeca2900f8765de: Status 404 returned error can't find the container with id ef7dc1e19d2b76242f9c6b79ea50c4e47573acec03055cc5deeca2900f8765de Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.697376 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.706136 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.710299 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.731734 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: W1203 22:05:31.736632 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod080247dd_b7ea_44e0_9145_da0eeade0107.slice/crio-2da1cfcc7e6454f7da4c1cda70123855ce8d7e468583cc6bde65a96406c19323 WatchSource:0}: Error finding container 2da1cfcc7e6454f7da4c1cda70123855ce8d7e468583cc6bde65a96406c19323: Status 404 returned error can't find the container with id 2da1cfcc7e6454f7da4c1cda70123855ce8d7e468583cc6bde65a96406c19323 Dec 03 
22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.747468 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.769591 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.796909 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.820182 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.856116 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.895926 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.917822 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.932648 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.946396 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.961427 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.973861 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.990955 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:31 crc kubenswrapper[4830]: I1203 22:05:31.993917 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.013754 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.027900 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded 
a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.051456 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.069461 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.075835 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.075900 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.075922 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.075948 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.075968 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076042 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:05:34.076009078 +0000 UTC m=+23.072470447 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076065 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076081 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076085 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076093 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076383 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076435 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076435 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:34.076415167 +0000 UTC m=+23.072876516 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076459 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076501 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:34.076489709 +0000 UTC m=+23.072951058 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076570 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:34.07655943 +0000 UTC m=+23.073020779 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076590 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.076655 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:34.076628362 +0000 UTC m=+23.073089711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.335928 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.335979 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.336043 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.336072 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.336150 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.336211 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.501424 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c"} Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.501934 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"d384acee36d352984805a1fbebe07735a2cccefaaedfc389a65a023cd6463f49"} Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.501950 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"ef7dc1e19d2b76242f9c6b79ea50c4e47573acec03055cc5deeca2900f8765de"} Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.502782 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sh485" event={"ID":"bdccedf8-f580-49f0-848e-108c748d8a21","Type":"ContainerStarted","Data":"7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98"} Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.502814 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sh485" event={"ID":"bdccedf8-f580-49f0-848e-108c748d8a21","Type":"ContainerStarted","Data":"d8da05419d7ee8f5e598012850f3a79e87564b8a315a5b87cab4c34b2711449c"} Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.507555 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b" 
exitCode=0 Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.507638 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b"} Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.507702 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"817569cb6e408227a7f1ee953f842f268ceb2eca9a84f6b3fa05f6e9d943f9d4"} Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.510433 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pr4cr" event={"ID":"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b","Type":"ContainerStarted","Data":"4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1"} Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.511725 4830 generic.go:334] "Generic (PLEG): container finished" podID="080247dd-b7ea-44e0-9145-da0eeade0107" containerID="b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b" exitCode=0 Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.511820 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" event={"ID":"080247dd-b7ea-44e0-9145-da0eeade0107","Type":"ContainerDied","Data":"b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b"} Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.511902 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" event={"ID":"080247dd-b7ea-44e0-9145-da0eeade0107","Type":"ContainerStarted","Data":"2da1cfcc7e6454f7da4c1cda70123855ce8d7e468583cc6bde65a96406c19323"} Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.520610 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.528258 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6"} Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.528545 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.532216 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.556062 4830 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.595358 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.609466 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.629831 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.644876 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.666720 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.681738 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.696910 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.717647 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.737788 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.758294 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.777115 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.797937 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.817338 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.832791 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.849868 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.865587 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-wzd28"] Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.866314 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wzd28" Dec 03 22:05:32 crc kubenswrapper[4830]: W1203 22:05:32.872885 4830 reflector.go:561] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": failed to list *v1.Secret: secrets "node-ca-dockercfg-4777p" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.872932 4830 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4777p\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-ca-dockercfg-4777p\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 22:05:32 crc kubenswrapper[4830]: W1203 22:05:32.872981 4830 reflector.go:561] object-"openshift-image-registry"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Dec 03 22:05:32 crc kubenswrapper[4830]: E1203 22:05:32.872992 4830 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.873138 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.873337 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.875195 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.900162 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.914378 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.930797 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.945837 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.960775 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.976080 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.984669 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b81e3c78-e222-410b-8cca-a4ba48f72f87-serviceca\") pod \"node-ca-wzd28\" (UID: \"b81e3c78-e222-410b-8cca-a4ba48f72f87\") " pod="openshift-image-registry/node-ca-wzd28" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.984751 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b81e3c78-e222-410b-8cca-a4ba48f72f87-host\") pod \"node-ca-wzd28\" (UID: \"b81e3c78-e222-410b-8cca-a4ba48f72f87\") " pod="openshift-image-registry/node-ca-wzd28" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.984796 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwtbv\" (UniqueName: \"kubernetes.io/projected/b81e3c78-e222-410b-8cca-a4ba48f72f87-kube-api-access-gwtbv\") pod \"node-ca-wzd28\" (UID: \"b81e3c78-e222-410b-8cca-a4ba48f72f87\") " pod="openshift-image-registry/node-ca-wzd28" Dec 03 22:05:32 crc kubenswrapper[4830]: I1203 22:05:32.990245 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:32Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.006117 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.019683 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.035100 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.049740 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.085719 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b81e3c78-e222-410b-8cca-a4ba48f72f87-host\") pod \"node-ca-wzd28\" (UID: \"b81e3c78-e222-410b-8cca-a4ba48f72f87\") " pod="openshift-image-registry/node-ca-wzd28" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 
22:05:33.085755 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwtbv\" (UniqueName: \"kubernetes.io/projected/b81e3c78-e222-410b-8cca-a4ba48f72f87-kube-api-access-gwtbv\") pod \"node-ca-wzd28\" (UID: \"b81e3c78-e222-410b-8cca-a4ba48f72f87\") " pod="openshift-image-registry/node-ca-wzd28" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.085781 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b81e3c78-e222-410b-8cca-a4ba48f72f87-serviceca\") pod \"node-ca-wzd28\" (UID: \"b81e3c78-e222-410b-8cca-a4ba48f72f87\") " pod="openshift-image-registry/node-ca-wzd28" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.085910 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b81e3c78-e222-410b-8cca-a4ba48f72f87-host\") pod \"node-ca-wzd28\" (UID: \"b81e3c78-e222-410b-8cca-a4ba48f72f87\") " pod="openshift-image-registry/node-ca-wzd28" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.087124 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b81e3c78-e222-410b-8cca-a4ba48f72f87-serviceca\") pod \"node-ca-wzd28\" (UID: \"b81e3c78-e222-410b-8cca-a4ba48f72f87\") " pod="openshift-image-registry/node-ca-wzd28" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.087787 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.105712 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.120472 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.134332 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.148793 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.163609 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.181922 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.195243 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.541251 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d"} Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.541896 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" 
event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600"} Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.541920 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba"} Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.541936 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483"} Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.541950 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8"} Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.541963 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe"} Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.545670 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970"} Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.549239 4830 generic.go:334] "Generic (PLEG): container finished" podID="080247dd-b7ea-44e0-9145-da0eeade0107" 
containerID="d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856" exitCode=0 Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.549903 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" event={"ID":"080247dd-b7ea-44e0-9145-da0eeade0107","Type":"ContainerDied","Data":"d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856"} Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.561366 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.576777 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.591628 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.611707 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.629354 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.649568 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.670604 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.683055 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.695247 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.704478 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.706560 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwtbv\" (UniqueName: \"kubernetes.io/projected/b81e3c78-e222-410b-8cca-a4ba48f72f87-kube-api-access-gwtbv\") pod \"node-ca-wzd28\" (UID: \"b81e3c78-e222-410b-8cca-a4ba48f72f87\") " pod="openshift-image-registry/node-ca-wzd28" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.717772 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.733558 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.747781 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.759827 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.774714 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.789075 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.808169 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.825684 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.851360 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.873612 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.885757 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.902692 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.920040 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.934824 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:33 crc kubenswrapper[4830]: I1203 22:05:33.953680 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.004313 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa083115
7e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:33Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.033587 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.095106 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095272 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:05:38.095241104 +0000 UTC m=+27.091702453 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.095356 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.095396 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") 
pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.095445 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.095467 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095536 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095606 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:38.095588472 +0000 UTC m=+27.092049821 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095615 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095667 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:38.095657623 +0000 UTC m=+27.092119052 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095672 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095689 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095701 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095753 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:38.095744265 +0000 UTC m=+27.092205614 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095775 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095834 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095858 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.095957 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:38.09592852 +0000 UTC m=+27.092389889 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.224618 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.239174 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.239294 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.240420 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.256496 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.268868 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.287565 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.301379 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefaaedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.314054 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.329600 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.336190 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.336190 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.336329 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.336397 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.336757 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.336856 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.343173 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.362616 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.383955 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.394994 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.399968 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wzd28" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.414110 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: W1203 22:05:34.414679 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb81e3c78_e222_410b_8cca_a4ba48f72f87.slice/crio-d16afe31a459c51557188c4dc64972f200a4199a3e2931c1bd7d8f67a0b22c10 WatchSource:0}: Error finding container d16afe31a459c51557188c4dc64972f200a4199a3e2931c1bd7d8f67a0b22c10: Status 404 returned error can't find the container with id d16afe31a459c51557188c4dc64972f200a4199a3e2931c1bd7d8f67a0b22c10 Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.428200 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.448425 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.463147 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.478108 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.493466 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.508494 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.519126 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.536576 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.539050 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.542485 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.557074 4830 generic.go:334] "Generic (PLEG): container finished" podID="080247dd-b7ea-44e0-9145-da0eeade0107" containerID="ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9" exitCode=0 Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.557178 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" event={"ID":"080247dd-b7ea-44e0-9145-da0eeade0107","Type":"ContainerDied","Data":"ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9"} Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.558741 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.558871 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wzd28" event={"ID":"b81e3c78-e222-410b-8cca-a4ba48f72f87","Type":"ContainerStarted","Data":"d16afe31a459c51557188c4dc64972f200a4199a3e2931c1bd7d8f67a0b22c10"} Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.569778 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 22:05:34 crc kubenswrapper[4830]: E1203 22:05:34.611489 4830 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.633677 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-t
ls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.672046 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.709902 4830 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.753050 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.806196 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:0
5:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.833341 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.873651 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.917088 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:34 crc kubenswrapper[4830]: I1203 22:05:34.954984 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.000083 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:34Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.036442 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.077145 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.127770 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.155833 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.193116 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.247981 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.277074 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b
74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.313137 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.391169 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.409104 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.440988 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.479002 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.567699 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3"} Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.570484 4830 generic.go:334] "Generic (PLEG): container finished" podID="080247dd-b7ea-44e0-9145-da0eeade0107" containerID="459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364" exitCode=0 Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.570562 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" event={"ID":"080247dd-b7ea-44e0-9145-da0eeade0107","Type":"ContainerDied","Data":"459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364"} Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.573650 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wzd28" event={"ID":"b81e3c78-e222-410b-8cca-a4ba48f72f87","Type":"ContainerStarted","Data":"e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc"} Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.587585 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.609370 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.626136 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.638993 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.672682 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-polic
y-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.713204 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.758005 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.792480 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.834159 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.887936 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.918816 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:35 crc kubenswrapper[4830]: I1203 22:05:35.972045 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.001382 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b
74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:35Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.030306 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.073687 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.115307 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.153608 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.190944 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.234005 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.274772 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-polic
y-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.310159 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.343027 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.343152 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.343223 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.343268 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:36 crc kubenswrapper[4830]: E1203 22:05:36.343400 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:36 crc kubenswrapper[4830]: E1203 22:05:36.343538 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:36 crc kubenswrapper[4830]: E1203 22:05:36.343670 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.345825 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.345894 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.345916 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.346150 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.359218 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b2670
2f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1
bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.405132 4830 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.405449 4830 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.406475 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.406569 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.406588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.406612 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.406635 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:36Z","lastTransitionTime":"2025-12-03T22:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:36 crc kubenswrapper[4830]: E1203 22:05:36.426809 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.431633 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.432498 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.432599 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.432622 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.432666 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.432693 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:36Z","lastTransitionTime":"2025-12-03T22:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:36 crc kubenswrapper[4830]: E1203 22:05:36.448349 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.453212 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.453249 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.453260 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.453279 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.453290 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:36Z","lastTransitionTime":"2025-12-03T22:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:36 crc kubenswrapper[4830]: E1203 22:05:36.466884 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.469825 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.469855 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.469866 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.469880 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.469889 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:36Z","lastTransitionTime":"2025-12-03T22:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.472795 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: E1203 22:05:36.481125 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.484712 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.484781 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.484798 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.484822 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.484836 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:36Z","lastTransitionTime":"2025-12-03T22:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:36 crc kubenswrapper[4830]: E1203 22:05:36.497445 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: E1203 22:05:36.497669 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.498976 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.499017 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.499032 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.499048 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.499061 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:36Z","lastTransitionTime":"2025-12-03T22:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.515829 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.552448 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.580003 4830 generic.go:334] "Generic (PLEG): container finished" podID="080247dd-b7ea-44e0-9145-da0eeade0107" containerID="e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332" exitCode=0 Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.580091 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" event={"ID":"080247dd-b7ea-44e0-9145-da0eeade0107","Type":"ContainerDied","Data":"e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332"} Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.600216 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22
:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.601232 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.601286 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.601302 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.601324 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.601340 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:36Z","lastTransitionTime":"2025-12-03T22:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.639593 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.675061 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.704854 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.704888 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.704898 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.704910 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.704919 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:36Z","lastTransitionTime":"2025-12-03T22:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.715789 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.755104 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.795277 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.808095 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.808131 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.808142 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.808158 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.808169 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:36Z","lastTransitionTime":"2025-12-03T22:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.831468 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.872166 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.911289 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:36 crc 
kubenswrapper[4830]: I1203 22:05:36.911348 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.911360 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.911381 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.911398 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:36Z","lastTransitionTime":"2025-12-03T22:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.919938 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.954806 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:36 crc kubenswrapper[4830]: I1203 22:05:36.990004 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:36Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.013418 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.013447 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.013455 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:37 crc 
kubenswrapper[4830]: I1203 22:05:37.013468 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.013478 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:37Z","lastTransitionTime":"2025-12-03T22:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.037216 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.078361 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b
74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.110891 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.115565 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.115604 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.115613 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.115627 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.115636 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:37Z","lastTransitionTime":"2025-12-03T22:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.153302 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.192551 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.218289 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.218340 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.218352 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:37 crc 
kubenswrapper[4830]: I1203 22:05:37.218373 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.218385 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:37Z","lastTransitionTime":"2025-12-03T22:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.237153 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.274058 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.317555 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.321284 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.321321 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.321332 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.321349 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.321361 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:37Z","lastTransitionTime":"2025-12-03T22:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.423615 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.423660 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.423672 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.423691 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.423703 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:37Z","lastTransitionTime":"2025-12-03T22:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.526765 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.526820 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.526836 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.526862 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.526879 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:37Z","lastTransitionTime":"2025-12-03T22:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.586353 4830 generic.go:334] "Generic (PLEG): container finished" podID="080247dd-b7ea-44e0-9145-da0eeade0107" containerID="7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3" exitCode=0 Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.586411 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" event={"ID":"080247dd-b7ea-44e0-9145-da0eeade0107","Type":"ContainerDied","Data":"7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3"} Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.621150 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.630573 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.630628 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.630645 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.630670 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.630691 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:37Z","lastTransitionTime":"2025-12-03T22:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.648157 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.662701 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.679065 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.695937 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.709999 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.724743 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.734375 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.734432 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.734450 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.734477 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.734497 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:37Z","lastTransitionTime":"2025-12-03T22:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.739037 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.754802 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.782785 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.806734 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.826947 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.838543 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.838613 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.838626 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.838643 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.838654 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:37Z","lastTransitionTime":"2025-12-03T22:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.845164 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.871565 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.915831 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:37Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.941382 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.941441 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.941458 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:37 crc 
kubenswrapper[4830]: I1203 22:05:37.941482 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:37 crc kubenswrapper[4830]: I1203 22:05:37.941499 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:37Z","lastTransitionTime":"2025-12-03T22:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.044623 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.044681 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.044698 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.044722 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.044740 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:38Z","lastTransitionTime":"2025-12-03T22:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.138020 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.138201 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138298 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138312 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:05:46.138269442 +0000 UTC m=+35.134730821 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138372 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:46.138349804 +0000 UTC m=+35.134811193 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.138423 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.138583 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138600 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138635 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138659 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138717 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:46.138696662 +0000 UTC m=+35.135158051 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.138634 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138757 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138803 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138863 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138940 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2025-12-03 22:05:46.138917037 +0000 UTC m=+35.135378476 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.138761 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.139007 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:46.138992139 +0000 UTC m=+35.135453518 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.149297 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.149355 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.149374 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.149397 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.149415 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:38Z","lastTransitionTime":"2025-12-03T22:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.252591 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.252660 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.252683 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.252712 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.252734 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:38Z","lastTransitionTime":"2025-12-03T22:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.336327 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.336327 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.336572 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.336654 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.336341 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:38 crc kubenswrapper[4830]: E1203 22:05:38.336757 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.356401 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.356454 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.356546 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.356581 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.356606 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:38Z","lastTransitionTime":"2025-12-03T22:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.459996 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.460050 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.460068 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.460093 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.460111 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:38Z","lastTransitionTime":"2025-12-03T22:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.563299 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.563355 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.563373 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.563397 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.563415 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:38Z","lastTransitionTime":"2025-12-03T22:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.598876 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.604923 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" event={"ID":"080247dd-b7ea-44e0-9145-da0eeade0107","Type":"ContainerStarted","Data":"23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.628449 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc 
kubenswrapper[4830]: I1203 22:05:38.642169 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.654570 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e1
2f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.666899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.666946 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.666972 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.667003 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.667027 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:38Z","lastTransitionTime":"2025-12-03T22:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.673263 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183
bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.691933 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.708455 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.740236 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.760136 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.769490 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.769565 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.769582 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.769606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.769623 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:38Z","lastTransitionTime":"2025-12-03T22:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.776547 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.793069 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.810392 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.839325 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.862325 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bon
d-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.872466 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.872568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.872588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.872617 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.872636 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:38Z","lastTransitionTime":"2025-12-03T22:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.878247 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.899633 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.975601 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.975664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.975680 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.975708 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:38 crc kubenswrapper[4830]: I1203 22:05:38.975725 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:38Z","lastTransitionTime":"2025-12-03T22:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.078844 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.078898 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.078916 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.078940 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.078960 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:39Z","lastTransitionTime":"2025-12-03T22:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.182454 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.182503 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.182555 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.182579 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.182595 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:39Z","lastTransitionTime":"2025-12-03T22:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.285593 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.285664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.285685 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.285709 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.285725 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:39Z","lastTransitionTime":"2025-12-03T22:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.389502 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.389620 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.389636 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.389659 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.389676 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:39Z","lastTransitionTime":"2025-12-03T22:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.493601 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.494162 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.494254 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.500199 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.500335 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:39Z","lastTransitionTime":"2025-12-03T22:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.603247 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.604373 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.604497 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.604656 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.604788 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:39Z","lastTransitionTime":"2025-12-03T22:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.608373 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.608449 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.624555 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.642337 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.642585 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.643758 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.658689 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.673719 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.691076 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.707934 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.707980 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.707994 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.708019 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.708058 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:39Z","lastTransitionTime":"2025-12-03T22:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.721421 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.738586 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee
8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a3
7563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03
T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.758480 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.777083 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.794615 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.811207 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.811860 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.811923 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.811940 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.811964 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.811978 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:39Z","lastTransitionTime":"2025-12-03T22:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.826704 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z 
is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.851443 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12
-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.870392 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.884352 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.916015 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.916094 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.916109 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:39 crc 
kubenswrapper[4830]: I1203 22:05:39.916138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.916158 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:39Z","lastTransitionTime":"2025-12-03T22:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.915979 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.935322 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.952502 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.969018 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:39 crc kubenswrapper[4830]: I1203 22:05:39.989191 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:39Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.008117 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:40Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.019556 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.019628 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.019653 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:40 crc 
kubenswrapper[4830]: I1203 22:05:40.019683 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.019705 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:40Z","lastTransitionTime":"2025-12-03T22:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.024837 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:40Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.042039 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:40Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.069530 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:40Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.092382 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8
a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37
563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T
22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:40Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.115881 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:40Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.123145 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.123244 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.123262 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.123287 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.123307 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:40Z","lastTransitionTime":"2025-12-03T22:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.139244 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183
bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:40Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.162356 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:40Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.181332 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:40Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.209046 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:40Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.225764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.225801 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.225808 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.225822 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.225830 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:40Z","lastTransitionTime":"2025-12-03T22:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.328632 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.328683 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.328707 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.328724 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.328733 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:40Z","lastTransitionTime":"2025-12-03T22:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.336412 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.336463 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.338406 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:40 crc kubenswrapper[4830]: E1203 22:05:40.338564 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:40 crc kubenswrapper[4830]: E1203 22:05:40.338709 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:40 crc kubenswrapper[4830]: E1203 22:05:40.338959 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.431479 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.431531 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.431549 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.431564 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.431573 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:40Z","lastTransitionTime":"2025-12-03T22:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.533820 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.533881 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.533899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.533924 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.533942 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:40Z","lastTransitionTime":"2025-12-03T22:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.611440 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.637262 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.637310 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.637330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.637357 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.637379 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:40Z","lastTransitionTime":"2025-12-03T22:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.740137 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.740196 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.740219 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.740244 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.740263 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:40Z","lastTransitionTime":"2025-12-03T22:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.842491 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.842539 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.842548 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.842561 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.842570 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:40Z","lastTransitionTime":"2025-12-03T22:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.944349 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.944386 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.944394 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.944408 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:40 crc kubenswrapper[4830]: I1203 22:05:40.944416 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:40Z","lastTransitionTime":"2025-12-03T22:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.046839 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.046901 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.046916 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.046938 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.046954 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:41Z","lastTransitionTime":"2025-12-03T22:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.149458 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.149579 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.149605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.149635 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.149668 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:41Z","lastTransitionTime":"2025-12-03T22:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.253551 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.253620 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.253641 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.253664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.253679 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:41Z","lastTransitionTime":"2025-12-03T22:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.356688 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.356752 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.356774 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.356801 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.356820 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:41Z","lastTransitionTime":"2025-12-03T22:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.358878 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.376124 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.398088 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.419348 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.442462 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.460643 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.460708 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.460727 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.460753 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.460779 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:41Z","lastTransitionTime":"2025-12-03T22:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.464097 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.491268 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.506768 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
3T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.526023 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.540375 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.552302 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.563250 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.563330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.563355 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.563388 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.563411 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:41Z","lastTransitionTime":"2025-12-03T22:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.564622 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z 
is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.605555 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12
-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.614829 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.621268 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.637386 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.666383 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.666433 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.666451 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:41 crc 
kubenswrapper[4830]: I1203 22:05:41.666472 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.666488 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:41Z","lastTransitionTime":"2025-12-03T22:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.770839 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.770965 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.770982 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.771008 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.771025 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:41Z","lastTransitionTime":"2025-12-03T22:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.874647 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.874720 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.874738 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.874766 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.874789 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:41Z","lastTransitionTime":"2025-12-03T22:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.977669 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.977904 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.978000 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.978033 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:41 crc kubenswrapper[4830]: I1203 22:05:41.978052 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:41Z","lastTransitionTime":"2025-12-03T22:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.081995 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.082066 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.082084 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.082112 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.082131 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:42Z","lastTransitionTime":"2025-12-03T22:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.185325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.185384 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.185402 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.185464 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.185484 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:42Z","lastTransitionTime":"2025-12-03T22:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.288899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.289008 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.289032 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.289059 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.289079 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:42Z","lastTransitionTime":"2025-12-03T22:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.336647 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.336680 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.336693 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:42 crc kubenswrapper[4830]: E1203 22:05:42.336853 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:42 crc kubenswrapper[4830]: E1203 22:05:42.337048 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:42 crc kubenswrapper[4830]: E1203 22:05:42.337193 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.391861 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.391960 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.391988 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.392024 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.392048 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:42Z","lastTransitionTime":"2025-12-03T22:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.494817 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.494870 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.494882 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.494901 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.494914 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:42Z","lastTransitionTime":"2025-12-03T22:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.598160 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.598383 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.598435 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.598470 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.598493 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:42Z","lastTransitionTime":"2025-12-03T22:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.620904 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/0.log" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.626087 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d" exitCode=1 Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.626158 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d"} Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.627272 4830 scope.go:117] "RemoveContainer" containerID="be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.643061 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.667273 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.681032 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.699643 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.700659 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.700868 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.701084 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.701290 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.701416 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:42Z","lastTransitionTime":"2025-12-03T22:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.713325 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z 
is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.733735 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12
-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.746191 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.755905 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.775031 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:41Z\\\",\\\"message\\\":\\\"olicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:05:40.893502 6111 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 22:05:40.894122 6111 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:05:40.894273 6111 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 22:05:40.894394 6111 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 22:05:40.894452 6111 factory.go:656] Stopping watch factory\\\\nI1203 22:05:40.894450 6111 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:05:40.894470 6111 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:05:40.894741 6111 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:05:40.895436 6111 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86
a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.788492 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.798364 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.804391 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.804426 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.804437 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.804453 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.804464 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:42Z","lastTransitionTime":"2025-12-03T22:05:42Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.812690 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.824358 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.836630 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.847176 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:42Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.910752 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.910825 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.910846 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.910875 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:42 crc kubenswrapper[4830]: I1203 22:05:42.910898 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:42Z","lastTransitionTime":"2025-12-03T22:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.014459 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.014618 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.014642 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.014673 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.014693 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:43Z","lastTransitionTime":"2025-12-03T22:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.117866 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.117924 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.117942 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.117967 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.117986 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:43Z","lastTransitionTime":"2025-12-03T22:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.221621 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.221709 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.221738 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.221776 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.222150 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:43Z","lastTransitionTime":"2025-12-03T22:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.325320 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.325390 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.325412 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.325441 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.325464 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:43Z","lastTransitionTime":"2025-12-03T22:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.428699 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.428764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.428786 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.428816 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.428840 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:43Z","lastTransitionTime":"2025-12-03T22:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.532092 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.532145 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.532179 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.532209 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.532228 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:43Z","lastTransitionTime":"2025-12-03T22:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.634835 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.634888 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.634904 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.634923 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.634955 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:43Z","lastTransitionTime":"2025-12-03T22:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.739102 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.739168 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.739186 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.739210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.739227 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:43Z","lastTransitionTime":"2025-12-03T22:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.842059 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.842124 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.842143 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.842168 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.842186 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:43Z","lastTransitionTime":"2025-12-03T22:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.944442 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.944480 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.944492 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.944537 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:43 crc kubenswrapper[4830]: I1203 22:05:43.944554 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:43Z","lastTransitionTime":"2025-12-03T22:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.046983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.047026 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.047037 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.047053 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.047064 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:44Z","lastTransitionTime":"2025-12-03T22:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.150378 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.150435 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.150477 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.150527 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.150548 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:44Z","lastTransitionTime":"2025-12-03T22:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.249637 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v"] Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.250076 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.251894 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.252642 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.252691 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.252707 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.252738 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.252660 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.252752 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:44Z","lastTransitionTime":"2025-12-03T22:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.270642 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.284609 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.296804 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.306042 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.311546 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.318034 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdc
abb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.330899 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
2-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.334944 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.335026 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.335144 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.335179 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4dc4\" (UniqueName: \"kubernetes.io/projected/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-kube-api-access-w4dc4\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.340223 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.340297 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:44 crc kubenswrapper[4830]: E1203 22:05:44.340425 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.340223 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:44 crc kubenswrapper[4830]: E1203 22:05:44.340907 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:44 crc kubenswrapper[4830]: E1203 22:05:44.341064 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.350249 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefaaedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.354469 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:44 crc 
kubenswrapper[4830]: I1203 22:05:44.354535 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.354549 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.354567 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.354580 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:44Z","lastTransitionTime":"2025-12-03T22:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.375826 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.394789 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.408546 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.424986 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.435815 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.435877 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.435932 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.435960 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4dc4\" (UniqueName: \"kubernetes.io/projected/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-kube-api-access-w4dc4\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.436584 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.437484 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.437863 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.448431 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.457221 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.457262 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.457278 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.457295 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.457308 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:44Z","lastTransitionTime":"2025-12-03T22:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.471146 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4dc4\" (UniqueName: \"kubernetes.io/projected/a19b7b77-9efe-4ebf-b9a4-f6253923cbc7-kube-api-access-w4dc4\") pod \"ovnkube-control-plane-749d76644c-9tz2v\" (UID: \"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.479390 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:41Z\\\",\\\"message\\\":\\\"olicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:05:40.893502 6111 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 22:05:40.894122 6111 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:05:40.894273 6111 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 22:05:40.894394 6111 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 22:05:40.894452 6111 factory.go:656] Stopping watch factory\\\\nI1203 22:05:40.894450 6111 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:05:40.894470 6111 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:05:40.894741 6111 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:05:40.895436 6111 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86
a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.512629 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.538077 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.550698 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.559057 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.559096 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.559108 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.559125 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.559137 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:44Z","lastTransitionTime":"2025-12-03T22:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.566340 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.637824 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/0.log" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.641738 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.642193 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.647849 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" event={"ID":"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7","Type":"ContainerStarted","Data":"6d003998890c346af71c5fe4c784936b9cc54ed7e0af3878038556d9ab8d1449"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.660908 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.662192 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.662240 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.662255 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.662272 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.662286 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:44Z","lastTransitionTime":"2025-12-03T22:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.671165 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.684595 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.695454 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.705872 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.717905 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.739029 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:41Z\\\",\\\"message\\\":\\\"olicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:05:40.893502 6111 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 22:05:40.894122 
6111 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:05:40.894273 6111 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 22:05:40.894394 6111 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 22:05:40.894452 6111 factory.go:656] Stopping watch factory\\\\nI1203 22:05:40.894450 6111 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:05:40.894470 6111 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:05:40.894741 6111 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:05:40.895436 6111 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.750998 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.764601 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.764651 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.764665 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.764688 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.764705 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:44Z","lastTransitionTime":"2025-12-03T22:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.771203 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.785148 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.795605 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.803113 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.813604 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.831299 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.843313 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.856005 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.867527 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.867575 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.867587 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:44 crc 
kubenswrapper[4830]: I1203 22:05:44.867608 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.867624 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:44Z","lastTransitionTime":"2025-12-03T22:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.971416 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.971501 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.971559 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.971590 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:44 crc kubenswrapper[4830]: I1203 22:05:44.971612 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:44Z","lastTransitionTime":"2025-12-03T22:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.075208 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.075270 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.075287 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.075311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.075328 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:45Z","lastTransitionTime":"2025-12-03T22:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.178355 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.178407 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.178424 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.178447 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.178464 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:45Z","lastTransitionTime":"2025-12-03T22:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.281737 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.281819 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.281867 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.281892 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.281909 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:45Z","lastTransitionTime":"2025-12-03T22:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.389826 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.390620 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-zlcmr"] Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.390879 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.390928 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.391025 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.391072 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:45Z","lastTransitionTime":"2025-12-03T22:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.391243 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:45 crc kubenswrapper[4830]: E1203 22:05:45.391322 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.420547 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.444492 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.447223 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.447341 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bggws\" (UniqueName: \"kubernetes.io/projected/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-kube-api-access-bggws\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.461349 4830 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.482798 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.494424 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.494461 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.494487 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.494521 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.494532 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:45Z","lastTransitionTime":"2025-12-03T22:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.505722 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.520548 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.536485 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.547958 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.548070 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bggws\" (UniqueName: \"kubernetes.io/projected/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-kube-api-access-bggws\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:45 crc kubenswrapper[4830]: E1203 22:05:45.548152 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:05:45 crc kubenswrapper[4830]: E1203 22:05:45.548238 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs podName:211b6f37-bd3f-475e-b4d9-e3d94ae07c52 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:46.048204403 +0000 UTC m=+35.044665772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs") pod "network-metrics-daemon-zlcmr" (UID: "211b6f37-bd3f-475e-b4d9-e3d94ae07c52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.553317 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.565007 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.569221 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bggws\" (UniqueName: \"kubernetes.io/projected/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-kube-api-access-bggws\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.582569 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.597656 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.597710 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.597726 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.597748 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.597761 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:45Z","lastTransitionTime":"2025-12-03T22:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.598100 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.616283 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.649337 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.654799 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/1.log" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.655728 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/0.log" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.659352 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e" exitCode=1 Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.659452 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.659547 4830 scope.go:117] "RemoveContainer" containerID="be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.660727 4830 scope.go:117] "RemoveContainer" containerID="bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e" Dec 03 22:05:45 crc kubenswrapper[4830]: E1203 22:05:45.660998 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 
22:05:45.661857 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" event={"ID":"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7","Type":"ContainerStarted","Data":"e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.661901 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" event={"ID":"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7","Type":"ContainerStarted","Data":"173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.685501 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:41Z\\\",\\\"message\\\":\\\"olicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:05:40.893502 6111 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 22:05:40.894122 
6111 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:05:40.894273 6111 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 22:05:40.894394 6111 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 22:05:40.894452 6111 factory.go:656] Stopping watch factory\\\\nI1203 22:05:40.894450 6111 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:05:40.894470 6111 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:05:40.894741 6111 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:05:40.895436 6111 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.700270 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.700321 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.700332 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.700353 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.700365 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:45Z","lastTransitionTime":"2025-12-03T22:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.701831 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.715561 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.728263 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc 
kubenswrapper[4830]: I1203 22:05:45.739995 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g
wtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.758545 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.775376 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.793963 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.803410 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.803451 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.803462 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.803494 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.803524 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:45Z","lastTransitionTime":"2025-12-03T22:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.810632 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.830456 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be702a1cfde866753543064975ae704a056ddfa6ed0569f1fe03b00e070a004d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:41Z\\\",\\\"message\\\":\\\"olicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:05:40.893502 6111 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 22:05:40.894122 
6111 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:05:40.894273 6111 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 22:05:40.894394 6111 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 22:05:40.894452 6111 factory.go:656] Stopping watch factory\\\\nI1203 22:05:40.894450 6111 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:05:40.894470 6111 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:05:40.894741 6111 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:05:40.895436 6111 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"2:05:44.739101 6254 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z]\\\\nI1203 22:05:44.738940 6254 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\
\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\
\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 
22:05:45.847830 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586
de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.862130 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",
\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.878719 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e
790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Runnin
g\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.892625 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc 
kubenswrapper[4830]: I1203 22:05:45.906177 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.906598 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.906638 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.906649 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.906709 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.906728 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:45Z","lastTransitionTime":"2025-12-03T22:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.921803 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.934001 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.948586 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.975974 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\
\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:45 crc kubenswrapper[4830]: I1203 22:05:45.994910 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:45Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.008679 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.008729 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.008744 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.008767 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.008783 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.012700 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefaaedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.053338 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.053636 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.053748 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs podName:211b6f37-bd3f-475e-b4d9-e3d94ae07c52 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:47.053714822 +0000 UTC m=+36.050176341 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs") pod "network-metrics-daemon-zlcmr" (UID: "211b6f37-bd3f-475e-b4d9-e3d94ae07c52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.112047 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.112105 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.112115 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.112136 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.112150 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.153971 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.154136 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154227 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:06:02.154185937 +0000 UTC m=+51.150647426 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154263 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154289 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154302 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.154313 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154361 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 22:06:02.154341271 +0000 UTC m=+51.150802830 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.154410 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.154443 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154454 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154498 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-03 22:06:02.154489544 +0000 UTC m=+51.150951133 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154543 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154558 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154568 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154608 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 22:06:02.154598897 +0000 UTC m=+51.151060476 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154667 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.154792 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:06:02.154761801 +0000 UTC m=+51.151223160 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.215636 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.215699 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.215720 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.215745 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.215761 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.318754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.318794 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.318804 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.318820 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.318829 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.336761 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.336864 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.336882 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.337015 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.337236 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.337390 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.421627 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.421706 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.421730 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.421762 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.421803 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.524910 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.525000 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.525018 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.525042 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.525061 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.629217 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.629288 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.629308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.629330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.629348 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.668783 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/1.log" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.674761 4830 scope.go:117] "RemoveContainer" containerID="bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.675044 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.692933 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa
2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.711977 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc 
kubenswrapper[4830]: I1203 22:05:46.733007 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.733112 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.733134 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.733159 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.733178 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.739122 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.758760 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.771626 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.786560 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.800366 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-polic
y-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.814371 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.826206 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.835213 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.835252 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.835264 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc 
kubenswrapper[4830]: I1203 22:05:46.835282 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.835294 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.849383 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.865630 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.880261 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.892352 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.895338 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.895372 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.895384 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.895400 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.895411 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.905474 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.910965 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.914270 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.914310 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.914321 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.914338 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.914347 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.924445 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"2:05:44.739101 6254 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node 
network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z]\\\\nI1203 22:05:44.738940 6254 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.926032 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.929060 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.929094 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.929105 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.929118 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.929127 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.938687 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.940755 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.943847 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.943884 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.943895 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.943912 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.943924 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.949891 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.955956 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.959106 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.959138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.959147 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.959159 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.959168 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.970541 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:46Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:46 crc kubenswrapper[4830]: E1203 22:05:46.970696 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.972248 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.972275 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.972286 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.972304 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:46 crc kubenswrapper[4830]: I1203 22:05:46.972314 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:46Z","lastTransitionTime":"2025-12-03T22:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.064620 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:47 crc kubenswrapper[4830]: E1203 22:05:47.064739 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:05:47 crc kubenswrapper[4830]: E1203 22:05:47.064789 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs podName:211b6f37-bd3f-475e-b4d9-e3d94ae07c52 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:49.064774993 +0000 UTC m=+38.061236342 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs") pod "network-metrics-daemon-zlcmr" (UID: "211b6f37-bd3f-475e-b4d9-e3d94ae07c52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.074465 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.074518 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.074530 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.074545 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.074554 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:47Z","lastTransitionTime":"2025-12-03T22:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.177023 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.177063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.177072 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.177087 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.177097 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:47Z","lastTransitionTime":"2025-12-03T22:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.280393 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.280454 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.280466 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.280484 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.280499 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:47Z","lastTransitionTime":"2025-12-03T22:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.336258 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:47 crc kubenswrapper[4830]: E1203 22:05:47.336442 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.383856 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.383914 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.383932 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.383957 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.383979 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:47Z","lastTransitionTime":"2025-12-03T22:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.487812 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.487867 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.487883 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.487905 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.487921 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:47Z","lastTransitionTime":"2025-12-03T22:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.590499 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.590572 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.590588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.590610 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.590627 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:47Z","lastTransitionTime":"2025-12-03T22:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.693405 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.693485 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.693503 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.693568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.693588 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:47Z","lastTransitionTime":"2025-12-03T22:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.796621 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.796678 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.796695 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.796723 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.796739 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:47Z","lastTransitionTime":"2025-12-03T22:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.899872 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.899946 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.899966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.899989 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:47 crc kubenswrapper[4830]: I1203 22:05:47.900007 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:47Z","lastTransitionTime":"2025-12-03T22:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.002617 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.002675 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.002690 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.002708 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.002720 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:48Z","lastTransitionTime":"2025-12-03T22:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.105777 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.105844 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.105867 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.105896 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.105921 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:48Z","lastTransitionTime":"2025-12-03T22:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.207898 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.207963 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.207982 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.208006 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.208023 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:48Z","lastTransitionTime":"2025-12-03T22:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.310164 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.310195 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.310202 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.310216 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.310226 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:48Z","lastTransitionTime":"2025-12-03T22:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.336020 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.336080 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:48 crc kubenswrapper[4830]: E1203 22:05:48.336248 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.336636 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:48 crc kubenswrapper[4830]: E1203 22:05:48.336785 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:48 crc kubenswrapper[4830]: E1203 22:05:48.336916 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.412829 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.412896 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.412914 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.412938 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.412956 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:48Z","lastTransitionTime":"2025-12-03T22:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.516085 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.516219 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.516242 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.516265 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.516283 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:48Z","lastTransitionTime":"2025-12-03T22:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.621271 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.621313 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.621324 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.621340 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.621352 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:48Z","lastTransitionTime":"2025-12-03T22:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.723090 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.723127 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.723139 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.723158 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.723170 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:48Z","lastTransitionTime":"2025-12-03T22:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.827349 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.827403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.827420 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.827444 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.827461 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:48Z","lastTransitionTime":"2025-12-03T22:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.930420 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.930966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.931189 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.931395 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:48 crc kubenswrapper[4830]: I1203 22:05:48.931576 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:48Z","lastTransitionTime":"2025-12-03T22:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.035077 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.035152 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.035180 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.035210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.035236 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:49Z","lastTransitionTime":"2025-12-03T22:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.065948 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.086370 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:49 crc kubenswrapper[4830]: E1203 22:05:49.086626 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:05:49 crc kubenswrapper[4830]: E1203 22:05:49.086721 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs podName:211b6f37-bd3f-475e-b4d9-e3d94ae07c52 nodeName:}" failed. No retries permitted until 2025-12-03 22:05:53.086701661 +0000 UTC m=+42.083163050 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs") pod "network-metrics-daemon-zlcmr" (UID: "211b6f37-bd3f-475e-b4d9-e3d94ae07c52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.092590 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"2:05:44.739101 6254 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node 
network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z]\\\\nI1203 22:05:44.738940 6254 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.116352 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.134830 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.138840 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.139112 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.139379 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.140400 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.140451 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:49Z","lastTransitionTime":"2025-12-03T22:05:49Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.160955 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c
7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 
22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.181715 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.203938 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.224556 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.243792 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.243844 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.243861 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.243885 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.243902 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:49Z","lastTransitionTime":"2025-12-03T22:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.246700 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.271284 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ku
be-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.287004 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc 
kubenswrapper[4830]: I1203 22:05:49.307637 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.325449 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.336864 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:49 crc kubenswrapper[4830]: E1203 22:05:49.337085 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.343251 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.347138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.347216 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.347242 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.347788 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.347815 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:49Z","lastTransitionTime":"2025-12-03T22:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.364313 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z 
is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.397336 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12
-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.416013 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.432382 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:49Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.451461 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.451568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.451603 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:49 crc 
kubenswrapper[4830]: I1203 22:05:49.451635 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.451653 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:49Z","lastTransitionTime":"2025-12-03T22:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.553459 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.553592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.553622 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.553662 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.553685 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:49Z","lastTransitionTime":"2025-12-03T22:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.656875 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.656938 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.656960 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.656984 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.657002 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:49Z","lastTransitionTime":"2025-12-03T22:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.759316 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.759398 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.759422 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.759445 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.759461 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:49Z","lastTransitionTime":"2025-12-03T22:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.862641 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.862700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.862717 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.862742 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.862759 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:49Z","lastTransitionTime":"2025-12-03T22:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.965937 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.966038 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.966064 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.966092 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:49 crc kubenswrapper[4830]: I1203 22:05:49.966114 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:49Z","lastTransitionTime":"2025-12-03T22:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.069220 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.069290 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.069307 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.069332 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.069350 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:50Z","lastTransitionTime":"2025-12-03T22:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.172913 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.173239 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.173451 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.173668 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.173838 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:50Z","lastTransitionTime":"2025-12-03T22:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.276374 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.276422 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.276433 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.276465 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.276476 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:50Z","lastTransitionTime":"2025-12-03T22:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.336953 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.337027 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.337136 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:50 crc kubenswrapper[4830]: E1203 22:05:50.337133 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:50 crc kubenswrapper[4830]: E1203 22:05:50.337242 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:50 crc kubenswrapper[4830]: E1203 22:05:50.337327 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.380068 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.380136 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.380155 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.380181 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.380202 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:50Z","lastTransitionTime":"2025-12-03T22:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.482649 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.482958 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.483111 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.483285 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.483583 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:50Z","lastTransitionTime":"2025-12-03T22:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.587598 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.587950 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.588075 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.588183 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.588292 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:50Z","lastTransitionTime":"2025-12-03T22:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.689907 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.689944 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.689953 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.689968 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.689977 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:50Z","lastTransitionTime":"2025-12-03T22:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.793066 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.793126 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.793145 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.793170 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.793187 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:50Z","lastTransitionTime":"2025-12-03T22:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.896142 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.896227 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.896251 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.896284 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.896321 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:50Z","lastTransitionTime":"2025-12-03T22:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.999221 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.999284 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.999308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.999338 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:50 crc kubenswrapper[4830]: I1203 22:05:50.999362 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:50Z","lastTransitionTime":"2025-12-03T22:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.102859 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.102915 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.102931 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.102954 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.102971 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:51Z","lastTransitionTime":"2025-12-03T22:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.206551 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.206614 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.206632 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.206657 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.206676 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:51Z","lastTransitionTime":"2025-12-03T22:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.310331 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.310715 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.310885 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.311210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.311446 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:51Z","lastTransitionTime":"2025-12-03T22:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.337004 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:51 crc kubenswrapper[4830]: E1203 22:05:51.337264 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.361019 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.380344 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.398078 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc 
kubenswrapper[4830]: I1203 22:05:51.415138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.415395 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.415455 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.415489 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.415540 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:51Z","lastTransitionTime":"2025-12-03T22:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.419736 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z 
is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.446374 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd75
5312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.468346 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.482153 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.519635 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.519672 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.519684 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.519697 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.519707 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:51Z","lastTransitionTime":"2025-12-03T22:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.519930 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.535082 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.552261 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.568897 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.592144 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"2:05:44.739101 6254 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node 
network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z]\\\\nI1203 22:05:44.738940 6254 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.610750 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.622104 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.622137 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.622149 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.622165 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.622177 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:51Z","lastTransitionTime":"2025-12-03T22:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.624728 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.640942 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 
22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.656249 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.671093 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.725032 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.725071 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.725080 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.725096 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.725106 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:51Z","lastTransitionTime":"2025-12-03T22:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.828382 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.828465 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.828477 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.828501 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.828542 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:51Z","lastTransitionTime":"2025-12-03T22:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.931654 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.931720 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.931740 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.931766 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:51 crc kubenswrapper[4830]: I1203 22:05:51.931785 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:51Z","lastTransitionTime":"2025-12-03T22:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.034750 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.034787 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.034797 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.034812 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.034824 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:52Z","lastTransitionTime":"2025-12-03T22:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.137967 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.138030 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.138048 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.138070 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.138087 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:52Z","lastTransitionTime":"2025-12-03T22:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.241056 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.241099 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.241112 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.241128 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.241141 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:52Z","lastTransitionTime":"2025-12-03T22:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.336730 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.336783 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.336730 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:52 crc kubenswrapper[4830]: E1203 22:05:52.336877 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:52 crc kubenswrapper[4830]: E1203 22:05:52.336971 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:52 crc kubenswrapper[4830]: E1203 22:05:52.337117 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.342960 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.342997 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.343013 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.343035 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.343052 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:52Z","lastTransitionTime":"2025-12-03T22:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.445607 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.445644 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.445653 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.445666 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.445676 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:52Z","lastTransitionTime":"2025-12-03T22:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.549167 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.549219 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.549235 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.549260 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.549278 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:52Z","lastTransitionTime":"2025-12-03T22:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.652245 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.652302 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.652311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.652332 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.652344 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:52Z","lastTransitionTime":"2025-12-03T22:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.755983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.756036 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.756046 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.756069 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.756081 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:52Z","lastTransitionTime":"2025-12-03T22:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.858691 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.858733 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.858745 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.858762 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.858776 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:52Z","lastTransitionTime":"2025-12-03T22:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.962429 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.962489 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.962501 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.962543 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:52 crc kubenswrapper[4830]: I1203 22:05:52.962592 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:52Z","lastTransitionTime":"2025-12-03T22:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.065807 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.065854 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.065863 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.065883 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.065895 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:53Z","lastTransitionTime":"2025-12-03T22:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.132058 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:53 crc kubenswrapper[4830]: E1203 22:05:53.132268 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:05:53 crc kubenswrapper[4830]: E1203 22:05:53.132342 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs podName:211b6f37-bd3f-475e-b4d9-e3d94ae07c52 nodeName:}" failed. No retries permitted until 2025-12-03 22:06:01.132318938 +0000 UTC m=+50.128780287 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs") pod "network-metrics-daemon-zlcmr" (UID: "211b6f37-bd3f-475e-b4d9-e3d94ae07c52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.168457 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.168501 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.168536 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.168554 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.168565 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:53Z","lastTransitionTime":"2025-12-03T22:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.271126 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.271207 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.271232 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.271261 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.271280 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:53Z","lastTransitionTime":"2025-12-03T22:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.337062 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:53 crc kubenswrapper[4830]: E1203 22:05:53.337298 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.374207 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.374277 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.374293 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.374322 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.374340 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:53Z","lastTransitionTime":"2025-12-03T22:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.477845 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.477889 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.477899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.477915 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.477926 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:53Z","lastTransitionTime":"2025-12-03T22:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.581231 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.581366 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.581384 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.581410 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.581429 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:53Z","lastTransitionTime":"2025-12-03T22:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.685700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.685791 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.685817 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.685854 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.685896 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:53Z","lastTransitionTime":"2025-12-03T22:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.788808 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.788869 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.788885 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.788907 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.788924 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:53Z","lastTransitionTime":"2025-12-03T22:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.892209 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.892248 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.892262 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.892278 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.892287 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:53Z","lastTransitionTime":"2025-12-03T22:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.995983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.996078 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.996093 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.996114 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:53 crc kubenswrapper[4830]: I1203 22:05:53.996127 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:53Z","lastTransitionTime":"2025-12-03T22:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.099875 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.099961 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.099985 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.100023 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.100047 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:54Z","lastTransitionTime":"2025-12-03T22:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.203713 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.203775 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.203791 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.203815 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.203834 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:54Z","lastTransitionTime":"2025-12-03T22:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.306893 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.306965 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.306992 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.307023 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.307048 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:54Z","lastTransitionTime":"2025-12-03T22:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.336652 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.336658 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:54 crc kubenswrapper[4830]: E1203 22:05:54.336980 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:54 crc kubenswrapper[4830]: E1203 22:05:54.337141 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.336662 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:54 crc kubenswrapper[4830]: E1203 22:05:54.337386 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.410340 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.410389 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.410407 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.410427 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.410442 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:54Z","lastTransitionTime":"2025-12-03T22:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.514411 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.514486 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.514549 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.514606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.514630 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:54Z","lastTransitionTime":"2025-12-03T22:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.617687 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.617746 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.617764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.617787 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.617808 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:54Z","lastTransitionTime":"2025-12-03T22:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.720222 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.720259 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.720271 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.720287 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.720298 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:54Z","lastTransitionTime":"2025-12-03T22:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.823449 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.823593 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.823623 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.823654 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.823676 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:54Z","lastTransitionTime":"2025-12-03T22:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.925917 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.925975 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.926002 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.926020 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:54 crc kubenswrapper[4830]: I1203 22:05:54.926033 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:54Z","lastTransitionTime":"2025-12-03T22:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.028235 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.028273 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.028283 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.028296 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.028343 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:55Z","lastTransitionTime":"2025-12-03T22:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.130463 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.130567 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.130576 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.130588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.130597 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:55Z","lastTransitionTime":"2025-12-03T22:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.232596 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.232657 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.232683 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.232714 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.232738 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:55Z","lastTransitionTime":"2025-12-03T22:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.335249 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.335321 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.335346 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.335372 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.335387 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:55Z","lastTransitionTime":"2025-12-03T22:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.336020 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:55 crc kubenswrapper[4830]: E1203 22:05:55.336185 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.437981 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.438043 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.438062 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.438088 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.438106 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:55Z","lastTransitionTime":"2025-12-03T22:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.541013 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.541051 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.541070 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.541091 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.541105 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:55Z","lastTransitionTime":"2025-12-03T22:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.644216 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.644262 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.644271 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.644287 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.644299 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:55Z","lastTransitionTime":"2025-12-03T22:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.748018 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.748090 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.748109 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.748137 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.748156 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:55Z","lastTransitionTime":"2025-12-03T22:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.851341 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.851425 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.851449 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.851484 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.851537 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:55Z","lastTransitionTime":"2025-12-03T22:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.954586 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.955392 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.955469 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.955575 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:55 crc kubenswrapper[4830]: I1203 22:05:55.955645 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:55Z","lastTransitionTime":"2025-12-03T22:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.076879 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.076917 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.076927 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.076942 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.076952 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:56Z","lastTransitionTime":"2025-12-03T22:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.179247 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.179319 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.179341 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.179374 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.179400 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:56Z","lastTransitionTime":"2025-12-03T22:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.282745 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.282823 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.282841 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.282869 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.282888 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:56Z","lastTransitionTime":"2025-12-03T22:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.336761 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.336836 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.336776 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:56 crc kubenswrapper[4830]: E1203 22:05:56.336939 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:56 crc kubenswrapper[4830]: E1203 22:05:56.337096 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:56 crc kubenswrapper[4830]: E1203 22:05:56.337201 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.386384 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.386604 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.386630 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.386660 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.386680 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:56Z","lastTransitionTime":"2025-12-03T22:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.489911 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.489964 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.489980 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.490006 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.490024 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:56Z","lastTransitionTime":"2025-12-03T22:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.593157 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.593213 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.593230 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.593253 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.593270 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:56Z","lastTransitionTime":"2025-12-03T22:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.696674 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.696764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.696785 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.696808 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.696825 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:56Z","lastTransitionTime":"2025-12-03T22:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.800253 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.800327 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.800352 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.800381 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.800403 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:56Z","lastTransitionTime":"2025-12-03T22:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.903568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.903642 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.903664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.903692 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:56 crc kubenswrapper[4830]: I1203 22:05:56.903709 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:56Z","lastTransitionTime":"2025-12-03T22:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.006830 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.006893 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.006909 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.006936 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.006953 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.109985 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.110046 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.110062 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.110086 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.110103 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.213341 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.213421 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.213449 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.213484 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.213502 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.316381 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.316445 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.316464 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.316494 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.316561 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.320152 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.320269 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.320296 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.320325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.320348 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.336883 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:57 crc kubenswrapper[4830]: E1203 22:05:57.337084 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:05:57 crc kubenswrapper[4830]: E1203 22:05:57.351663 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:57Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.355920 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.355981 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.356000 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.356024 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.356040 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: E1203 22:05:57.376245 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:57Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.380849 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.381049 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.381190 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.381388 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.381596 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: E1203 22:05:57.401355 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:57Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.405455 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.405688 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.405835 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.405981 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.406141 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: E1203 22:05:57.428937 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:57Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.434182 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.434463 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.434722 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.434911 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.435060 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: E1203 22:05:57.448695 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:57Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:57 crc kubenswrapper[4830]: E1203 22:05:57.448843 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.450836 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.450879 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.450891 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.450907 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.450918 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.553469 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.553575 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.553596 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.553622 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.553640 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.656095 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.656163 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.656204 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.656232 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.656250 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.759487 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.759929 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.760077 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.760339 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.760410 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.863213 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.863616 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.863732 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.863832 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.863929 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.967719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.967795 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.967819 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.967851 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:57 crc kubenswrapper[4830]: I1203 22:05:57.967873 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:57Z","lastTransitionTime":"2025-12-03T22:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.071633 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.071696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.071712 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.071736 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.071755 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:58Z","lastTransitionTime":"2025-12-03T22:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.175007 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.175110 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.175128 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.175161 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.175178 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:58Z","lastTransitionTime":"2025-12-03T22:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.278141 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.278203 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.278223 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.278245 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.278263 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:58Z","lastTransitionTime":"2025-12-03T22:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.336361 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.336416 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.336488 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:05:58 crc kubenswrapper[4830]: E1203 22:05:58.336663 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:05:58 crc kubenswrapper[4830]: E1203 22:05:58.336811 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:05:58 crc kubenswrapper[4830]: E1203 22:05:58.336983 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.381293 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.381342 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.381359 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.381384 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.381405 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:58Z","lastTransitionTime":"2025-12-03T22:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.484465 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.484554 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.484580 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.484610 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.484630 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:58Z","lastTransitionTime":"2025-12-03T22:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.621860 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.621940 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.621964 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.621992 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.622041 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:58Z","lastTransitionTime":"2025-12-03T22:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.724696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.724761 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.724801 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.724834 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.724858 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:58Z","lastTransitionTime":"2025-12-03T22:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.828242 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.828307 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.828326 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.828350 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.828368 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:58Z","lastTransitionTime":"2025-12-03T22:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.931648 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.931727 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.931746 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.931777 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:58 crc kubenswrapper[4830]: I1203 22:05:58.931815 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:58Z","lastTransitionTime":"2025-12-03T22:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.035854 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.035917 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.035939 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.035964 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.035983 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:59Z","lastTransitionTime":"2025-12-03T22:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.143764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.143849 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.143874 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.143905 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.143928 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:59Z","lastTransitionTime":"2025-12-03T22:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.246421 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.246480 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.246490 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.246544 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.246558 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:59Z","lastTransitionTime":"2025-12-03T22:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.336869 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:05:59 crc kubenswrapper[4830]: E1203 22:05:59.337118 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.348407 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.348452 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.348464 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.348482 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.348493 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:59Z","lastTransitionTime":"2025-12-03T22:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.451569 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.453236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.453292 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.453308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.453468 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.453490 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:59Z","lastTransitionTime":"2025-12-03T22:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.462459 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.491162 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc4
85d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.515046 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.538080 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.557833 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.558040 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.558148 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:59 crc 
kubenswrapper[4830]: I1203 22:05:59.558288 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.558395 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:59Z","lastTransitionTime":"2025-12-03T22:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.561036 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.579804 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.614164 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"2:05:44.739101 6254 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node 
network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z]\\\\nI1203 22:05:44.738940 6254 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.642321 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.664044 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.664119 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.664139 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.664174 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.664199 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:59Z","lastTransitionTime":"2025-12-03T22:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.669142 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.693648 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 
22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.720629 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.744822 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.762985 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.767686 4830 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.768547 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.768800 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.769193 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.769715 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:59Z","lastTransitionTime":"2025-12-03T22:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.779288 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc 
kubenswrapper[4830]: I1203 22:05:59.794196 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.812493 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e1
2f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.828571 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:0
5:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.843797 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:59Z is after 2025-08-24T17:21:41Z" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.873665 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.873742 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.873756 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.873809 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.873826 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:59Z","lastTransitionTime":"2025-12-03T22:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.977342 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.977392 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.977410 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.977435 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:05:59 crc kubenswrapper[4830]: I1203 22:05:59.977448 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:05:59Z","lastTransitionTime":"2025-12-03T22:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.079577 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.079628 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.079638 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.079657 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.079669 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:00Z","lastTransitionTime":"2025-12-03T22:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.182852 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.182888 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.182896 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.182908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.182917 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:00Z","lastTransitionTime":"2025-12-03T22:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.286013 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.286084 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.286106 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.286137 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.286159 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:00Z","lastTransitionTime":"2025-12-03T22:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.336335 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.336391 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.336346 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:00 crc kubenswrapper[4830]: E1203 22:06:00.336566 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:00 crc kubenswrapper[4830]: E1203 22:06:00.336696 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:00 crc kubenswrapper[4830]: E1203 22:06:00.336827 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.390718 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.390778 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.390797 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.390823 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.390843 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:00Z","lastTransitionTime":"2025-12-03T22:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.493841 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.493919 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.493938 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.493976 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.493995 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:00Z","lastTransitionTime":"2025-12-03T22:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.597350 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.597409 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.597419 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.597443 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.597456 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:00Z","lastTransitionTime":"2025-12-03T22:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.711103 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.711161 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.711172 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.711194 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.711209 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:00Z","lastTransitionTime":"2025-12-03T22:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.815799 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.815880 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.815905 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.815939 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.815960 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:00Z","lastTransitionTime":"2025-12-03T22:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.920486 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.920588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.920607 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.920631 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:00 crc kubenswrapper[4830]: I1203 22:06:00.920648 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:00Z","lastTransitionTime":"2025-12-03T22:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.024672 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.024756 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.024772 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.024800 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.024818 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:01Z","lastTransitionTime":"2025-12-03T22:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.127811 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.127904 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.127935 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.127969 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.127992 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:01Z","lastTransitionTime":"2025-12-03T22:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.178660 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:01 crc kubenswrapper[4830]: E1203 22:06:01.178905 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:06:01 crc kubenswrapper[4830]: E1203 22:06:01.179024 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs podName:211b6f37-bd3f-475e-b4d9-e3d94ae07c52 nodeName:}" failed. No retries permitted until 2025-12-03 22:06:17.178995245 +0000 UTC m=+66.175456594 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs") pod "network-metrics-daemon-zlcmr" (UID: "211b6f37-bd3f-475e-b4d9-e3d94ae07c52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.230570 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.230610 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.230620 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.230637 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.230648 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:01Z","lastTransitionTime":"2025-12-03T22:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.334596 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.334670 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.334696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.334727 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.334747 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:01Z","lastTransitionTime":"2025-12-03T22:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.338558 4830 scope.go:117] "RemoveContainer" containerID="bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.339128 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:01 crc kubenswrapper[4830]: E1203 22:06:01.339486 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.355138 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.375128 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aa
d3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.395897 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.415098 4830 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f3
5a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.437330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.437379 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.437396 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.437425 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.437442 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:01Z","lastTransitionTime":"2025-12-03T22:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.438213 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387
a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.472739 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.496000 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.516361 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.538474 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.539443 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.539530 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.539544 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.539562 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.539577 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:01Z","lastTransitionTime":"2025-12-03T22:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.559223 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.593078 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"2:05:44.739101 6254 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node 
network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z]\\\\nI1203 22:05:44.738940 6254 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.620559 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.638936 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.644435 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.644555 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.644726 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.644773 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.644813 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:01Z","lastTransitionTime":"2025-12-03T22:06:01Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.660952 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c
7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 
22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.676827 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.691800 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.710028 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.728000 4830 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc 
kubenswrapper[4830]: I1203 22:06:01.731687 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/1.log" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.734629 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.735210 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.748601 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.748632 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.748643 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.748656 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.748669 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:01Z","lastTransitionTime":"2025-12-03T22:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.755057 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183
bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.772321 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.796135 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.811890 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.829632 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.852282 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.852338 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.852356 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.852386 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.852406 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:01Z","lastTransitionTime":"2025-12-03T22:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.865029 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.880581 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.919581 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.942894 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.954920 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.954983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.954997 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.955014 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.955027 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:01Z","lastTransitionTime":"2025-12-03T22:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.958019 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:01 crc kubenswrapper[4830]: I1203 22:06:01.985218 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 
22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:01Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.002878 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.015723 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.029486 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.050583 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"2:05:44.739101 6254 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node 
network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z]\\\\nI1203 22:05:44.738940 6254 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Groups:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\"
,\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.057059 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.057085 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.057095 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.057110 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.057118 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:02Z","lastTransitionTime":"2025-12-03T22:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.066255 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.080318 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da50381
0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.092783 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc 
kubenswrapper[4830]: I1203 22:06:02.160248 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.160307 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.160325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.160351 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.160371 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:02Z","lastTransitionTime":"2025-12-03T22:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.213650 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.213931 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214011 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:06:34.213955866 +0000 UTC m=+83.210417265 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214126 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.214128 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214155 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214177 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214271 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 22:06:34.214246073 +0000 UTC m=+83.210707432 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.214300 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214339 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214412 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214428 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214439 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214447 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:06:34.214419977 +0000 UTC m=+83.210881476 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.214354 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214470 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 22:06:34.214461118 +0000 UTC m=+83.210922487 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214619 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.214729 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:06:34.214706344 +0000 UTC m=+83.211167863 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.264501 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.264559 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.264571 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.264587 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.264598 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:02Z","lastTransitionTime":"2025-12-03T22:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.336741 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.336815 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.336845 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.336885 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.337022 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.337279 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.367088 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.367186 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.367203 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.367229 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.367248 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:02Z","lastTransitionTime":"2025-12-03T22:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.470340 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.470392 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.470406 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.470424 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.470436 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:02Z","lastTransitionTime":"2025-12-03T22:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.574231 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.574299 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.574316 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.574342 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.574366 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:02Z","lastTransitionTime":"2025-12-03T22:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.677116 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.677151 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.677160 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.677175 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.677184 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:02Z","lastTransitionTime":"2025-12-03T22:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.739775 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/2.log" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.740791 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/1.log" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.745624 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120" exitCode=1 Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.745753 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120"} Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.745865 4830 scope.go:117] "RemoveContainer" containerID="bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.747098 4830 scope.go:117] "RemoveContainer" containerID="0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120" Dec 03 22:06:02 crc kubenswrapper[4830]: E1203 22:06:02.747451 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.770955 4830 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12
-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b
0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.784792 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.784845 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.785017 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.785028 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.785051 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.785075 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:02Z","lastTransitionTime":"2025-12-03T22:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.801812 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d
163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.818852 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.840013 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.856224 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.878701 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb390fae0df2effdc30df645405a669219d3be009cded90ddce405524dcba3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"2:05:44.739101 6254 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node 
network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:05:44Z is after 2025-08-24T17:21:41Z]\\\\nI1203 22:05:44.738940 6254 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]strin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:02Z\\\",\\\"message\\\":\\\":141\\\\nI1203 22:06:02.385192 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:06:02.385300 6448 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.385474 6448 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.385625 6448 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386269 6448 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:02.386278 6448 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:02.386296 6448 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:02.386303 6448 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:06:02.386339 6448 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:02.386370 6448 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:02.386388 6448 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386373 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 22:06:02.386754 6448 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overri
des\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 
22:06:02.887966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.888014 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.888026 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.888044 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.888055 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:02Z","lastTransitionTime":"2025-12-03T22:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.895457 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.910800 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ku
be-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.925318 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc 
kubenswrapper[4830]: I1203 22:06:02.940293 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.959826 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.983594 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.991147 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.991222 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.991248 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.991278 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:02 crc kubenswrapper[4830]: I1203 22:06:02.991305 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:02Z","lastTransitionTime":"2025-12-03T22:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.001163 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:02Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.020808 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.052393 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.073231 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.092797 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.093771 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.093819 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.093836 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:03 crc 
kubenswrapper[4830]: I1203 22:06:03.093862 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.093880 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:03Z","lastTransitionTime":"2025-12-03T22:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.196693 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.196781 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.196805 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.196839 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.196862 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:03Z","lastTransitionTime":"2025-12-03T22:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.299555 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.299610 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.299627 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.299649 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.299666 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:03Z","lastTransitionTime":"2025-12-03T22:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.336698 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:03 crc kubenswrapper[4830]: E1203 22:06:03.336914 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.403080 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.403212 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.403258 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.403283 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.403296 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:03Z","lastTransitionTime":"2025-12-03T22:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.506376 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.506432 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.506446 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.506469 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.506483 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:03Z","lastTransitionTime":"2025-12-03T22:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.609719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.609790 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.609816 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.609852 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.609872 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:03Z","lastTransitionTime":"2025-12-03T22:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.712542 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.712620 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.712642 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.712673 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.712697 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:03Z","lastTransitionTime":"2025-12-03T22:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.750603 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/2.log" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.756266 4830 scope.go:117] "RemoveContainer" containerID="0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120" Dec 03 22:06:03 crc kubenswrapper[4830]: E1203 22:06:03.758951 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.777238 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.801436 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:02Z\\\",\\\"message\\\":\\\":141\\\\nI1203 22:06:02.385192 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:06:02.385300 6448 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 
22:06:02.385474 6448 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.385625 6448 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386269 6448 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:02.386278 6448 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:02.386296 6448 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:02.386303 6448 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:06:02.386339 6448 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:02.386370 6448 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:02.386388 6448 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386373 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 22:06:02.386754 6448 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.815649 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.815690 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.815701 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.815719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.815731 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:03Z","lastTransitionTime":"2025-12-03T22:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.822353 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.837706 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.852608 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T2
2:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.875844 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.900943 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.919495 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.919571 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.919586 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 
22:06:03.919605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.919623 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:03Z","lastTransitionTime":"2025-12-03T22:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.923030 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.936431 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa
2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.947102 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc 
kubenswrapper[4830]: I1203 22:06:03.959754 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc 
kubenswrapper[4830]: I1203 22:06:03.970429 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.982100 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:03 crc kubenswrapper[4830]: I1203 22:06:03.996399 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:03Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.011928 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:04Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.022305 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.022340 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.022351 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.022370 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.022382 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:04Z","lastTransitionTime":"2025-12-03T22:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.036018 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:04Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.050504 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:04Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.063709 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:04Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.125267 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.125580 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.125592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:04 crc 
kubenswrapper[4830]: I1203 22:06:04.125608 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.125621 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:04Z","lastTransitionTime":"2025-12-03T22:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.228754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.228815 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.228833 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.228857 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.228874 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:04Z","lastTransitionTime":"2025-12-03T22:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.331677 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.331739 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.331769 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.331798 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.331814 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:04Z","lastTransitionTime":"2025-12-03T22:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.336431 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.336453 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:04 crc kubenswrapper[4830]: E1203 22:06:04.336609 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.336641 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:04 crc kubenswrapper[4830]: E1203 22:06:04.336774 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:04 crc kubenswrapper[4830]: E1203 22:06:04.336892 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.435044 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.435125 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.435160 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.435191 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.435212 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:04Z","lastTransitionTime":"2025-12-03T22:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.538268 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.538327 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.538346 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.538369 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.538386 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:04Z","lastTransitionTime":"2025-12-03T22:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.640884 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.640913 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.640921 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.640934 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.640943 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:04Z","lastTransitionTime":"2025-12-03T22:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.744029 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.744102 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.744132 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.744177 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.744203 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:04Z","lastTransitionTime":"2025-12-03T22:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.847612 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.847672 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.847688 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.847712 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.847731 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:04Z","lastTransitionTime":"2025-12-03T22:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.950895 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.950954 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.950974 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.950997 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:04 crc kubenswrapper[4830]: I1203 22:06:04.951013 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:04Z","lastTransitionTime":"2025-12-03T22:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.053604 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.053642 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.053653 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.053669 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.053680 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:05Z","lastTransitionTime":"2025-12-03T22:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.157263 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.157820 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.157888 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.157924 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.157963 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:05Z","lastTransitionTime":"2025-12-03T22:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.262038 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.262105 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.262129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.262158 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.262179 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:05Z","lastTransitionTime":"2025-12-03T22:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.336447 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:05 crc kubenswrapper[4830]: E1203 22:06:05.336694 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.365390 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.365748 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.365902 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.366034 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.366173 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:05Z","lastTransitionTime":"2025-12-03T22:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.468648 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.468770 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.468795 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.468824 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.468846 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:05Z","lastTransitionTime":"2025-12-03T22:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.571984 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.572061 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.572085 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.572112 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.572129 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:05Z","lastTransitionTime":"2025-12-03T22:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.673816 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.673885 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.673908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.673937 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.673960 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:05Z","lastTransitionTime":"2025-12-03T22:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.777211 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.777288 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.777321 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.777349 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.777370 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:05Z","lastTransitionTime":"2025-12-03T22:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.880256 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.880315 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.880332 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.880356 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.880373 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:05Z","lastTransitionTime":"2025-12-03T22:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.983124 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.983185 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.983206 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.983237 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:05 crc kubenswrapper[4830]: I1203 22:06:05.983258 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:05Z","lastTransitionTime":"2025-12-03T22:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.086461 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.086563 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.086588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.086611 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.086628 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:06Z","lastTransitionTime":"2025-12-03T22:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.189874 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.189936 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.189958 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.189986 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.190008 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:06Z","lastTransitionTime":"2025-12-03T22:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.293116 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.293195 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.293221 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.293251 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.293275 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:06Z","lastTransitionTime":"2025-12-03T22:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.336751 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.336834 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.336752 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:06 crc kubenswrapper[4830]: E1203 22:06:06.336961 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:06 crc kubenswrapper[4830]: E1203 22:06:06.337124 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:06 crc kubenswrapper[4830]: E1203 22:06:06.337246 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.395595 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.395653 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.395664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.395682 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.395692 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:06Z","lastTransitionTime":"2025-12-03T22:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.498637 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.498687 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.498710 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.498739 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.498761 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:06Z","lastTransitionTime":"2025-12-03T22:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.602064 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.602146 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.602167 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.602191 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.602209 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:06Z","lastTransitionTime":"2025-12-03T22:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.705939 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.706030 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.706056 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.706118 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.706145 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:06Z","lastTransitionTime":"2025-12-03T22:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.810353 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.810456 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.810603 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.810685 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.810723 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:06Z","lastTransitionTime":"2025-12-03T22:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.914600 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.914679 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.914696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.914719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:06 crc kubenswrapper[4830]: I1203 22:06:06.914734 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:06Z","lastTransitionTime":"2025-12-03T22:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.018026 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.018080 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.018097 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.018122 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.018139 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.121831 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.121923 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.121947 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.121983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.122027 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.225451 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.225552 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.225579 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.225608 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.225630 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.328843 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.328897 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.328915 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.328939 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.328958 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.336625 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:07 crc kubenswrapper[4830]: E1203 22:06:07.336812 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.432702 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.432758 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.432776 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.432801 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.432822 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.535880 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.535952 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.535969 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.535994 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.536012 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.568200 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.568300 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.568321 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.568733 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.568967 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: E1203 22:06:07.591395 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:07Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.598277 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.598355 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.598380 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.598411 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.598432 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: E1203 22:06:07.621347 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:07Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.627221 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.627277 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.627297 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.627323 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.627340 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: E1203 22:06:07.648021 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:07Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.654114 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.654177 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.654194 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.654223 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.654244 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: E1203 22:06:07.675375 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:07Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.680452 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.680538 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.680559 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.680583 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.680600 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: E1203 22:06:07.701592 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:07Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:07 crc kubenswrapper[4830]: E1203 22:06:07.701842 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.705066 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.705094 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.705104 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.705119 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.705131 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.807975 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.808068 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.808129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.808160 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.808225 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.911484 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.911606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.911635 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.911670 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:07 crc kubenswrapper[4830]: I1203 22:06:07.911723 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:07Z","lastTransitionTime":"2025-12-03T22:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.014439 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.014532 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.014550 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.014578 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.014599 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:08Z","lastTransitionTime":"2025-12-03T22:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.117897 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.117952 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.117970 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.117993 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.118009 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:08Z","lastTransitionTime":"2025-12-03T22:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.220315 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.220368 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.220385 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.220408 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.220425 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:08Z","lastTransitionTime":"2025-12-03T22:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.323050 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.323130 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.323148 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.323176 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.323193 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:08Z","lastTransitionTime":"2025-12-03T22:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.336382 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.336503 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:08 crc kubenswrapper[4830]: E1203 22:06:08.336751 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.336960 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:08 crc kubenswrapper[4830]: E1203 22:06:08.337095 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:08 crc kubenswrapper[4830]: E1203 22:06:08.337259 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.425860 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.425946 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.425967 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.426001 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.426022 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:08Z","lastTransitionTime":"2025-12-03T22:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.528820 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.528897 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.528924 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.528948 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.528964 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:08Z","lastTransitionTime":"2025-12-03T22:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.631644 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.631704 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.631723 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.631747 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.631765 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:08Z","lastTransitionTime":"2025-12-03T22:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.734939 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.734990 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.735009 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.735030 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.735045 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:08Z","lastTransitionTime":"2025-12-03T22:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.838050 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.838114 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.838168 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.838208 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.838236 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:08Z","lastTransitionTime":"2025-12-03T22:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.941486 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.941595 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.941617 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.941646 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:08 crc kubenswrapper[4830]: I1203 22:06:08.941663 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:08Z","lastTransitionTime":"2025-12-03T22:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.044641 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.044708 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.044729 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.044758 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.044781 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:09Z","lastTransitionTime":"2025-12-03T22:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.148109 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.148171 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.148232 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.148259 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.148278 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:09Z","lastTransitionTime":"2025-12-03T22:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.251829 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.251895 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.251914 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.251937 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.251956 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:09Z","lastTransitionTime":"2025-12-03T22:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.336878 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:09 crc kubenswrapper[4830]: E1203 22:06:09.337051 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.355473 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.355575 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.355597 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.355623 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.355642 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:09Z","lastTransitionTime":"2025-12-03T22:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.459087 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.459158 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.459185 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.459215 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.459236 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:09Z","lastTransitionTime":"2025-12-03T22:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.562390 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.562455 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.562477 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.562578 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.562607 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:09Z","lastTransitionTime":"2025-12-03T22:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.665270 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.665325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.665341 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.665367 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.665384 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:09Z","lastTransitionTime":"2025-12-03T22:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.767930 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.767988 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.768005 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.768034 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.768057 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:09Z","lastTransitionTime":"2025-12-03T22:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.871014 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.871089 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.871130 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.871155 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.871180 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:09Z","lastTransitionTime":"2025-12-03T22:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.973853 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.973924 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.973943 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.973968 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:09 crc kubenswrapper[4830]: I1203 22:06:09.973987 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:09Z","lastTransitionTime":"2025-12-03T22:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.077671 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.077773 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.077785 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.077809 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.077823 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:10Z","lastTransitionTime":"2025-12-03T22:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.182073 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.182166 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.182189 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.182222 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.182244 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:10Z","lastTransitionTime":"2025-12-03T22:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.285568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.285625 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.285643 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.285669 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.285688 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:10Z","lastTransitionTime":"2025-12-03T22:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.336983 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.337070 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.337155 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:10 crc kubenswrapper[4830]: E1203 22:06:10.337283 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:10 crc kubenswrapper[4830]: E1203 22:06:10.338063 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:10 crc kubenswrapper[4830]: E1203 22:06:10.338130 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.389163 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.389211 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.389225 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.389247 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.389262 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:10Z","lastTransitionTime":"2025-12-03T22:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.492847 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.492887 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.492896 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.492913 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.492923 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:10Z","lastTransitionTime":"2025-12-03T22:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.595918 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.595995 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.596010 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.596029 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.596041 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:10Z","lastTransitionTime":"2025-12-03T22:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.699178 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.699247 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.699266 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.699292 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.699310 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:10Z","lastTransitionTime":"2025-12-03T22:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.803225 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.803291 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.803309 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.803334 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.803352 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:10Z","lastTransitionTime":"2025-12-03T22:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.906172 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.906233 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.906250 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.906273 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:10 crc kubenswrapper[4830]: I1203 22:06:10.906292 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:10Z","lastTransitionTime":"2025-12-03T22:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.009710 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.009769 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.009785 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.009811 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.009828 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:11Z","lastTransitionTime":"2025-12-03T22:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.112884 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.112952 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.112966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.112989 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.113006 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:11Z","lastTransitionTime":"2025-12-03T22:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.216225 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.216310 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.216324 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.216344 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.216361 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:11Z","lastTransitionTime":"2025-12-03T22:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.318434 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.318481 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.318490 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.318526 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.318537 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:11Z","lastTransitionTime":"2025-12-03T22:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.336031 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:11 crc kubenswrapper[4830]: E1203 22:06:11.337104 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.368445 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.381289 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.394819 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.411414 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4e
b865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.421493 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.421548 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.421560 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.421582 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.421595 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:11Z","lastTransitionTime":"2025-12-03T22:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.432492 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.451790 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.468380 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.496458 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:02Z\\\",\\\"message\\\":\\\":141\\\\nI1203 22:06:02.385192 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:06:02.385300 6448 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 
22:06:02.385474 6448 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.385625 6448 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386269 6448 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:02.386278 6448 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:02.386296 6448 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:02.386303 6448 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:06:02.386339 6448 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:02.386370 6448 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:02.386388 6448 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386373 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 22:06:02.386754 6448 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.515173 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.525974 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.526024 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.526038 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.526066 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.526081 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:11Z","lastTransitionTime":"2025-12-03T22:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.532898 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.551846 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.568404 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa
2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.585633 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc 
kubenswrapper[4830]: I1203 22:06:11.607286 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.625239 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.629500 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.629562 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.629573 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.629592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.629604 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:11Z","lastTransitionTime":"2025-12-03T22:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.644578 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.658557 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.682764 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:11Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.732411 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.732461 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.732479 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.732538 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.732564 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:11Z","lastTransitionTime":"2025-12-03T22:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.835787 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.835861 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.835871 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.835892 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.835906 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:11Z","lastTransitionTime":"2025-12-03T22:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.939098 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.939150 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.939160 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.939179 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:11 crc kubenswrapper[4830]: I1203 22:06:11.939190 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:11Z","lastTransitionTime":"2025-12-03T22:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.042612 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.042693 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.042711 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.042742 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.042765 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:12Z","lastTransitionTime":"2025-12-03T22:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.145745 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.145925 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.145952 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.146061 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.146082 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:12Z","lastTransitionTime":"2025-12-03T22:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.249212 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.249287 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.249299 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.249342 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.249358 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:12Z","lastTransitionTime":"2025-12-03T22:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.336356 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.336396 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.336445 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:12 crc kubenswrapper[4830]: E1203 22:06:12.336584 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:12 crc kubenswrapper[4830]: E1203 22:06:12.336728 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:12 crc kubenswrapper[4830]: E1203 22:06:12.336962 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.352565 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.352630 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.352654 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.352687 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.352710 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:12Z","lastTransitionTime":"2025-12-03T22:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.455899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.455938 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.455949 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.455964 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.455977 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:12Z","lastTransitionTime":"2025-12-03T22:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.558980 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.559054 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.559064 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.559088 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.559105 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:12Z","lastTransitionTime":"2025-12-03T22:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.661240 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.661330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.661353 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.661388 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.661409 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:12Z","lastTransitionTime":"2025-12-03T22:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.764425 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.764491 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.764534 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.764561 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.764578 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:12Z","lastTransitionTime":"2025-12-03T22:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.867814 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.867890 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.867913 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.867942 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.867966 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:12Z","lastTransitionTime":"2025-12-03T22:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.971224 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.971303 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.971323 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.971348 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:12 crc kubenswrapper[4830]: I1203 22:06:12.971365 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:12Z","lastTransitionTime":"2025-12-03T22:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.074371 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.074440 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.074456 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.074481 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.074498 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:13Z","lastTransitionTime":"2025-12-03T22:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.177918 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.177982 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.178005 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.178034 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.178065 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:13Z","lastTransitionTime":"2025-12-03T22:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.280748 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.280800 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.280819 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.280847 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.280864 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:13Z","lastTransitionTime":"2025-12-03T22:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.336920 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:13 crc kubenswrapper[4830]: E1203 22:06:13.337120 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.383213 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.383278 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.383292 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.383310 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.383322 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:13Z","lastTransitionTime":"2025-12-03T22:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.486207 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.486249 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.486276 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.486291 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.486300 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:13Z","lastTransitionTime":"2025-12-03T22:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.588868 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.588925 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.588933 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.588947 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.588974 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:13Z","lastTransitionTime":"2025-12-03T22:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.692002 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.692060 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.692071 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.692087 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.692095 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:13Z","lastTransitionTime":"2025-12-03T22:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.794402 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.794449 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.794466 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.794489 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.794538 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:13Z","lastTransitionTime":"2025-12-03T22:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.897102 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.897168 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.897277 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.897349 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:13 crc kubenswrapper[4830]: I1203 22:06:13.897372 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:13Z","lastTransitionTime":"2025-12-03T22:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.000077 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.000119 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.000127 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.000140 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.000151 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:14Z","lastTransitionTime":"2025-12-03T22:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.103290 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.103344 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.103361 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.103419 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.103438 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:14Z","lastTransitionTime":"2025-12-03T22:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.205872 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.205930 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.205947 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.205971 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.205987 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:14Z","lastTransitionTime":"2025-12-03T22:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.309159 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.309217 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.309234 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.309258 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.309275 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:14Z","lastTransitionTime":"2025-12-03T22:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.336832 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.336863 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.336852 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:14 crc kubenswrapper[4830]: E1203 22:06:14.337032 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:14 crc kubenswrapper[4830]: E1203 22:06:14.337240 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:14 crc kubenswrapper[4830]: E1203 22:06:14.337319 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.413571 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.413645 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.413666 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.413694 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.413715 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:14Z","lastTransitionTime":"2025-12-03T22:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.516984 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.517026 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.517037 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.517057 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.517070 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:14Z","lastTransitionTime":"2025-12-03T22:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.620235 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.620280 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.620291 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.620311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.620324 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:14Z","lastTransitionTime":"2025-12-03T22:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.722871 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.722936 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.722956 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.722983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.723004 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:14Z","lastTransitionTime":"2025-12-03T22:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.826791 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.826858 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.826879 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.826908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.826929 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:14Z","lastTransitionTime":"2025-12-03T22:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.929775 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.929862 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.929886 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.929918 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:14 crc kubenswrapper[4830]: I1203 22:06:14.929942 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:14Z","lastTransitionTime":"2025-12-03T22:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.050216 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.050261 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.050275 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.050293 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.050308 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:15Z","lastTransitionTime":"2025-12-03T22:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.153077 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.153115 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.153126 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.153140 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.153148 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:15Z","lastTransitionTime":"2025-12-03T22:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.257009 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.257040 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.257049 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.257063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.257075 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:15Z","lastTransitionTime":"2025-12-03T22:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.336077 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:15 crc kubenswrapper[4830]: E1203 22:06:15.336285 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.338112 4830 scope.go:117] "RemoveContainer" containerID="0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120" Dec 03 22:06:15 crc kubenswrapper[4830]: E1203 22:06:15.338752 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.358979 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.359001 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.359010 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.359023 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.359032 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:15Z","lastTransitionTime":"2025-12-03T22:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.466736 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.466813 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.466840 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.466870 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.466893 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:15Z","lastTransitionTime":"2025-12-03T22:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.569778 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.569822 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.569835 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.569853 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.569863 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:15Z","lastTransitionTime":"2025-12-03T22:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.672294 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.672334 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.672346 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.672362 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.672371 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:15Z","lastTransitionTime":"2025-12-03T22:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.775358 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.775410 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.775423 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.775444 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.775455 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:15Z","lastTransitionTime":"2025-12-03T22:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.878171 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.878245 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.878263 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.878285 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.878299 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:15Z","lastTransitionTime":"2025-12-03T22:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.981078 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.981119 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.981132 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.981152 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:15 crc kubenswrapper[4830]: I1203 22:06:15.981167 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:15Z","lastTransitionTime":"2025-12-03T22:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.082990 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.083027 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.083040 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.083056 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.083068 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:16Z","lastTransitionTime":"2025-12-03T22:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.187173 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.187203 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.187211 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.187224 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.187232 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:16Z","lastTransitionTime":"2025-12-03T22:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.289539 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.289603 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.289618 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.289642 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.289656 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:16Z","lastTransitionTime":"2025-12-03T22:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.336164 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.336288 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:16 crc kubenswrapper[4830]: E1203 22:06:16.336364 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.336400 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:16 crc kubenswrapper[4830]: E1203 22:06:16.336489 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:16 crc kubenswrapper[4830]: E1203 22:06:16.336671 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.392984 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.393033 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.393047 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.393101 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.393113 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:16Z","lastTransitionTime":"2025-12-03T22:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.495554 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.495604 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.495614 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.495633 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.495643 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:16Z","lastTransitionTime":"2025-12-03T22:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.599046 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.599105 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.599124 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.599149 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.599166 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:16Z","lastTransitionTime":"2025-12-03T22:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.702184 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.702226 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.702239 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.702257 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.702270 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:16Z","lastTransitionTime":"2025-12-03T22:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.804826 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.804876 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.804890 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.804909 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.804923 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:16Z","lastTransitionTime":"2025-12-03T22:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.907686 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.907740 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.907757 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.907781 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:16 crc kubenswrapper[4830]: I1203 22:06:16.907799 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:16Z","lastTransitionTime":"2025-12-03T22:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.010382 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.010450 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.010465 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.010491 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.010538 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.112868 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.112931 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.112949 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.112974 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.112991 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.180829 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:17 crc kubenswrapper[4830]: E1203 22:06:17.181046 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:06:17 crc kubenswrapper[4830]: E1203 22:06:17.181162 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs podName:211b6f37-bd3f-475e-b4d9-e3d94ae07c52 nodeName:}" failed. No retries permitted until 2025-12-03 22:06:49.181133995 +0000 UTC m=+98.177595384 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs") pod "network-metrics-daemon-zlcmr" (UID: "211b6f37-bd3f-475e-b4d9-e3d94ae07c52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.215561 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.215627 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.215650 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.215679 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.215701 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.321240 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.321328 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.321345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.321368 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.321387 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.336691 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:17 crc kubenswrapper[4830]: E1203 22:06:17.336879 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.426041 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.426135 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.426154 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.426186 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.426237 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.529286 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.529334 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.529345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.529362 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.529373 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.633027 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.633373 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.633389 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.633408 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.633424 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.739955 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.740007 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.740032 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.740051 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.740066 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.807785 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sh485_bdccedf8-f580-49f0-848e-108c748d8a21/kube-multus/0.log" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.807865 4830 generic.go:334] "Generic (PLEG): container finished" podID="bdccedf8-f580-49f0-848e-108c748d8a21" containerID="7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98" exitCode=1 Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.807920 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sh485" event={"ID":"bdccedf8-f580-49f0-848e-108c748d8a21","Type":"ContainerDied","Data":"7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.808553 4830 scope.go:117] "RemoveContainer" containerID="7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.827998 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.838945 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.842623 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.842676 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.842712 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.842862 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.842893 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.852126 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"2025-12-03T22:05:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e\\\\n2025-12-03T22:05:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e to /host/opt/cni/bin/\\\\n2025-12-03T22:05:32Z [verbose] multus-daemon started\\\\n2025-12-03T22:05:32Z [verbose] Readiness Indicator file check\\\\n2025-12-03T22:06:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.864795 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\
"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.876261 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.886830 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.903186 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.915538 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.927878 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.935314 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.935387 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.935402 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.935422 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.935434 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.943399 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: E1203 22:06:17.948267 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.954615 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.954655 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.954689 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.954706 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.954717 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.955351 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: E1203 22:06:17.976139 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.994279 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:02Z\\\",\\\"message\\\":\\\":141\\\\nI1203 22:06:02.385192 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:06:02.385300 6448 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 
22:06:02.385474 6448 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.385625 6448 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386269 6448 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:02.386278 6448 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:02.386296 6448 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:02.386303 6448 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:06:02.386339 6448 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:02.386370 6448 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:02.386388 6448 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386373 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 22:06:02.386754 6448 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:17Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.995499 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.995555 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.995566 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.995590 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:17 crc kubenswrapper[4830]: I1203 22:06:17.995601 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:17Z","lastTransitionTime":"2025-12-03T22:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: E1203 22:06:18.010932 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.015235 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.015298 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.015308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.015330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.015343 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.033785 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: E1203 22:06:18.035665 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.039332 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.039370 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.039385 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.039408 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.039425 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.049087 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: E1203 22:06:18.052479 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: E1203 22:06:18.052618 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.054276 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.054311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.054324 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.054335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.054344 4830 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.070648 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses
\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.083832 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc 
kubenswrapper[4830]: I1203 22:06:18.094977 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.105204 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.157083 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.157117 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.157129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.157145 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.157157 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.259749 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.259812 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.259829 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.259847 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.259860 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.336854 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:18 crc kubenswrapper[4830]: E1203 22:06:18.337082 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.337456 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:18 crc kubenswrapper[4830]: E1203 22:06:18.337662 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.337720 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:18 crc kubenswrapper[4830]: E1203 22:06:18.337839 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.361967 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.362029 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.362054 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.362081 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.362101 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.464293 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.464328 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.464338 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.464351 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.464361 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.566421 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.566480 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.566497 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.566552 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.566569 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.669159 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.669220 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.669237 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.669261 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.669278 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.771749 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.771798 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.771813 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.771831 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.771844 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.814691 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sh485_bdccedf8-f580-49f0-848e-108c748d8a21/kube-multus/0.log" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.814767 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sh485" event={"ID":"bdccedf8-f580-49f0-848e-108c748d8a21","Type":"ContainerStarted","Data":"d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.832865 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.845809 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa
2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.858477 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc 
kubenswrapper[4830]: I1203 22:06:18.871949 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.874641 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.874753 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.874779 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.874808 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.874833 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.882577 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.894648 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.904107 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.915021 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"2025-12-03T22:05:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e\\\\n2025-12-03T22:05:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e to /host/opt/cni/bin/\\\\n2025-12-03T22:05:32Z [verbose] multus-daemon started\\\\n2025-12-03T22:05:32Z [verbose] Readiness Indicator file check\\\\n2025-12-03T22:06:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.933726 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.950180 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.963618 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.977942 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4e
b865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.978951 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.978983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.978991 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.979004 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.979012 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:18Z","lastTransitionTime":"2025-12-03T22:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:18 crc kubenswrapper[4830]: I1203 22:06:18.989717 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:18Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.006570 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:19Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.022894 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:06:19Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.051051 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:02Z\\\",\\\"message\\\":\\\":141\\\\nI1203 22:06:02.385192 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:06:02.385300 6448 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 
22:06:02.385474 6448 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.385625 6448 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386269 6448 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:02.386278 6448 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:02.386296 6448 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:02.386303 6448 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:06:02.386339 6448 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:02.386370 6448 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:02.386388 6448 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386373 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 22:06:02.386754 6448 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:19Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.070480 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:19Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.080933 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.080979 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.080990 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.081006 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.081016 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:19Z","lastTransitionTime":"2025-12-03T22:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.084012 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:19Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.183608 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.183648 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.183657 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.183672 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.183681 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:19Z","lastTransitionTime":"2025-12-03T22:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.287168 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.287216 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.287228 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.287248 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.287263 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:19Z","lastTransitionTime":"2025-12-03T22:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.336394 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:19 crc kubenswrapper[4830]: E1203 22:06:19.336591 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.389921 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.389961 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.389971 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.389987 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.389996 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:19Z","lastTransitionTime":"2025-12-03T22:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.492273 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.492317 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.492326 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.492341 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.492350 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:19Z","lastTransitionTime":"2025-12-03T22:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.594844 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.594898 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.594909 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.594926 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.594937 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:19Z","lastTransitionTime":"2025-12-03T22:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.698403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.698451 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.698464 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.698482 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.698499 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:19Z","lastTransitionTime":"2025-12-03T22:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.801386 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.801475 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.801497 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.801556 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.801576 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:19Z","lastTransitionTime":"2025-12-03T22:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.903488 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.903584 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.903603 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.903629 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:19 crc kubenswrapper[4830]: I1203 22:06:19.903647 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:19Z","lastTransitionTime":"2025-12-03T22:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.006094 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.006160 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.006179 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.006204 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.006221 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:20Z","lastTransitionTime":"2025-12-03T22:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.108649 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.109065 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.109081 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.109100 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.109112 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:20Z","lastTransitionTime":"2025-12-03T22:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.211599 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.211657 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.211666 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.211681 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.211689 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:20Z","lastTransitionTime":"2025-12-03T22:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.313899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.313942 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.313950 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.313965 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.313974 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:20Z","lastTransitionTime":"2025-12-03T22:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.336756 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.336796 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.336845 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:20 crc kubenswrapper[4830]: E1203 22:06:20.336924 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:20 crc kubenswrapper[4830]: E1203 22:06:20.336994 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:20 crc kubenswrapper[4830]: E1203 22:06:20.337051 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.417399 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.417449 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.417466 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.417489 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.417532 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:20Z","lastTransitionTime":"2025-12-03T22:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.521229 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.521301 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.521317 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.521342 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.521359 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:20Z","lastTransitionTime":"2025-12-03T22:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.623589 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.623644 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.623656 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.623675 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.623689 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:20Z","lastTransitionTime":"2025-12-03T22:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.726027 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.726091 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.726109 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.726133 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.726154 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:20Z","lastTransitionTime":"2025-12-03T22:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.828734 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.828778 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.828792 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.828808 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.828818 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:20Z","lastTransitionTime":"2025-12-03T22:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.931972 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.932014 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.932025 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.932040 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:20 crc kubenswrapper[4830]: I1203 22:06:20.932049 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:20Z","lastTransitionTime":"2025-12-03T22:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.035137 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.035193 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.035225 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.035243 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.035255 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:21Z","lastTransitionTime":"2025-12-03T22:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.138236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.138297 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.138314 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.138339 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.138357 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:21Z","lastTransitionTime":"2025-12-03T22:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.240475 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.240605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.240630 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.240664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.240692 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:21Z","lastTransitionTime":"2025-12-03T22:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.335978 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:21 crc kubenswrapper[4830]: E1203 22:06:21.336110 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.342450 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.342530 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.342549 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.342571 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.342587 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:21Z","lastTransitionTime":"2025-12-03T22:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.356100 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.374665 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ku
be-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.388479 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc 
kubenswrapper[4830]: I1203 22:06:21.409500 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"2025-12-03T22:05:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e\\\\n2025-12-03T22:05:32+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e to /host/opt/cni/bin/\\\\n2025-12-03T22:05:32Z [verbose] multus-daemon started\\\\n2025-12-03T22:05:32Z [verbose] Readiness Indicator file check\\\\n2025-12-03T22:06:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.432161 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.444212 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.444246 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.444255 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.444269 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.444282 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:21Z","lastTransitionTime":"2025-12-03T22:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.453090 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.468780 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.482205 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.514371 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.533238 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.547438 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.547603 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.547632 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.547854 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.547903 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:21Z","lastTransitionTime":"2025-12-03T22:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.549545 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefaaedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.567805 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.597093 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:02Z\\\",\\\"message\\\":\\\":141\\\\nI1203 22:06:02.385192 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:06:02.385300 6448 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 
22:06:02.385474 6448 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.385625 6448 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386269 6448 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:02.386278 6448 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:02.386296 6448 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:02.386303 6448 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:06:02.386339 6448 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:02.386370 6448 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:02.386388 6448 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386373 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 22:06:02.386754 6448 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.621421 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.638942 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.655472 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.655580 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.655621 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.655656 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.655681 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:21Z","lastTransitionTime":"2025-12-03T22:06:21Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.663591 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c
7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 
22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.683050 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.697113 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:21Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.757819 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.757875 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.757892 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.757916 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.757934 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:21Z","lastTransitionTime":"2025-12-03T22:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.859713 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.859790 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.859810 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.859835 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.859851 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:21Z","lastTransitionTime":"2025-12-03T22:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.963111 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.963153 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.963163 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.963179 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:21 crc kubenswrapper[4830]: I1203 22:06:21.963188 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:21Z","lastTransitionTime":"2025-12-03T22:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.064902 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.064940 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.064952 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.064968 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.064977 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:22Z","lastTransitionTime":"2025-12-03T22:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.167942 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.168058 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.168080 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.168138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.168166 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:22Z","lastTransitionTime":"2025-12-03T22:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.270877 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.270908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.270919 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.270934 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.270944 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:22Z","lastTransitionTime":"2025-12-03T22:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.336931 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.337168 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:22 crc kubenswrapper[4830]: E1203 22:06:22.337264 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.336931 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:22 crc kubenswrapper[4830]: E1203 22:06:22.337371 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:22 crc kubenswrapper[4830]: E1203 22:06:22.337459 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.374292 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.374330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.374339 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.374353 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.374366 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:22Z","lastTransitionTime":"2025-12-03T22:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.477701 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.477740 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.477751 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.477765 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.477775 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:22Z","lastTransitionTime":"2025-12-03T22:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.580332 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.580420 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.580437 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.580461 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.580478 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:22Z","lastTransitionTime":"2025-12-03T22:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.683030 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.683105 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.683128 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.683156 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.683181 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:22Z","lastTransitionTime":"2025-12-03T22:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.785735 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.785770 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.785780 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.785798 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.785808 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:22Z","lastTransitionTime":"2025-12-03T22:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.888282 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.888378 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.888404 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.888436 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.888457 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:22Z","lastTransitionTime":"2025-12-03T22:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.990776 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.990812 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.990822 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.990840 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:22 crc kubenswrapper[4830]: I1203 22:06:22.990850 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:22Z","lastTransitionTime":"2025-12-03T22:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.093884 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.093985 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.094009 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.094036 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.094058 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:23Z","lastTransitionTime":"2025-12-03T22:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.196054 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.196108 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.196123 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.196138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.196150 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:23Z","lastTransitionTime":"2025-12-03T22:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.299134 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.299183 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.299201 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.299229 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.299252 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:23Z","lastTransitionTime":"2025-12-03T22:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.336062 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:23 crc kubenswrapper[4830]: E1203 22:06:23.336284 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.401813 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.401863 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.401880 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.401906 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.401925 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:23Z","lastTransitionTime":"2025-12-03T22:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.504605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.504664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.504686 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.504711 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.504732 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:23Z","lastTransitionTime":"2025-12-03T22:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.607271 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.607340 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.607364 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.607392 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.607414 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:23Z","lastTransitionTime":"2025-12-03T22:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.709912 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.709944 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.709974 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.709991 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.710001 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:23Z","lastTransitionTime":"2025-12-03T22:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.811952 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.811987 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.811998 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.812038 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.812048 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:23Z","lastTransitionTime":"2025-12-03T22:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.914769 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.914834 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.914849 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.914874 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:23 crc kubenswrapper[4830]: I1203 22:06:23.914892 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:23Z","lastTransitionTime":"2025-12-03T22:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.018452 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.018519 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.018529 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.018544 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.018554 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:24Z","lastTransitionTime":"2025-12-03T22:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.121362 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.121440 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.121457 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.121485 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.121502 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:24Z","lastTransitionTime":"2025-12-03T22:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.224004 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.224057 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.224068 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.224086 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.224098 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:24Z","lastTransitionTime":"2025-12-03T22:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.326921 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.326990 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.327010 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.327036 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.327053 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:24Z","lastTransitionTime":"2025-12-03T22:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.336215 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.336266 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.336340 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:24 crc kubenswrapper[4830]: E1203 22:06:24.336446 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:24 crc kubenswrapper[4830]: E1203 22:06:24.336537 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:24 crc kubenswrapper[4830]: E1203 22:06:24.336782 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.348400 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.430297 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.430360 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.430376 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.430403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.430426 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:24Z","lastTransitionTime":"2025-12-03T22:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.533911 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.533967 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.533984 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.534009 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.534027 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:24Z","lastTransitionTime":"2025-12-03T22:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.641473 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.641578 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.641602 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.641636 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.641664 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:24Z","lastTransitionTime":"2025-12-03T22:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.744167 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.744562 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.744581 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.744606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.744622 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:24Z","lastTransitionTime":"2025-12-03T22:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.846785 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.846823 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.846834 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.846850 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.846861 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:24Z","lastTransitionTime":"2025-12-03T22:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.950404 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.950455 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.950467 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.950482 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:24 crc kubenswrapper[4830]: I1203 22:06:24.950492 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:24Z","lastTransitionTime":"2025-12-03T22:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.053873 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.053947 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.053971 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.054002 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.054029 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:25Z","lastTransitionTime":"2025-12-03T22:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.156236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.156296 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.156312 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.156335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.156354 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:25Z","lastTransitionTime":"2025-12-03T22:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.258676 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.258737 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.258755 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.258779 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.258797 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:25Z","lastTransitionTime":"2025-12-03T22:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.336953 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:25 crc kubenswrapper[4830]: E1203 22:06:25.337139 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.360983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.361082 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.361100 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.361125 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.361142 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:25Z","lastTransitionTime":"2025-12-03T22:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.463895 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.463969 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.463987 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.464012 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.464029 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:25Z","lastTransitionTime":"2025-12-03T22:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.565695 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.565748 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.565764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.565783 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.565799 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:25Z","lastTransitionTime":"2025-12-03T22:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.669046 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.669304 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.669329 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.669353 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.669370 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:25Z","lastTransitionTime":"2025-12-03T22:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.772114 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.772164 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.772176 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.772194 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.772208 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:25Z","lastTransitionTime":"2025-12-03T22:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.873970 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.874032 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.874049 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.874072 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.874087 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:25Z","lastTransitionTime":"2025-12-03T22:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.977309 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.977377 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.977400 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.977430 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:25 crc kubenswrapper[4830]: I1203 22:06:25.977452 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:25Z","lastTransitionTime":"2025-12-03T22:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.080495 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.080589 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.080627 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.080658 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.080680 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:26Z","lastTransitionTime":"2025-12-03T22:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.184212 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.184279 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.184303 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.184333 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.184356 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:26Z","lastTransitionTime":"2025-12-03T22:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.287284 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.287426 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.287452 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.287481 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.287500 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:26Z","lastTransitionTime":"2025-12-03T22:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.336339 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.336434 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.336348 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:26 crc kubenswrapper[4830]: E1203 22:06:26.336539 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:26 crc kubenswrapper[4830]: E1203 22:06:26.336690 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:26 crc kubenswrapper[4830]: E1203 22:06:26.336859 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.391638 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.391698 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.391757 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.391789 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.391805 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:26Z","lastTransitionTime":"2025-12-03T22:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.499593 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.499652 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.499668 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.500037 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.500060 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:26Z","lastTransitionTime":"2025-12-03T22:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.603076 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.603142 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.603168 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.603192 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.603210 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:26Z","lastTransitionTime":"2025-12-03T22:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.705692 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.705744 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.705762 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.705786 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.705802 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:26Z","lastTransitionTime":"2025-12-03T22:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.808851 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.808927 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.808945 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.808972 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.808992 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:26Z","lastTransitionTime":"2025-12-03T22:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.911770 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.911830 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.911846 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.911869 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:26 crc kubenswrapper[4830]: I1203 22:06:26.911886 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:26Z","lastTransitionTime":"2025-12-03T22:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.015051 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.015096 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.015112 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.015133 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.015149 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:27Z","lastTransitionTime":"2025-12-03T22:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.118041 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.118111 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.118129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.118151 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.118168 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:27Z","lastTransitionTime":"2025-12-03T22:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.222572 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.222632 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.222648 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.222678 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.222695 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:27Z","lastTransitionTime":"2025-12-03T22:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.326012 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.326119 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.326138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.326163 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.326181 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:27Z","lastTransitionTime":"2025-12-03T22:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.336489 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:27 crc kubenswrapper[4830]: E1203 22:06:27.336752 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.337864 4830 scope.go:117] "RemoveContainer" containerID="0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.429085 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.429119 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.429130 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.429147 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.429159 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:27Z","lastTransitionTime":"2025-12-03T22:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.532097 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.532160 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.532178 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.532203 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.532222 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:27Z","lastTransitionTime":"2025-12-03T22:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.635567 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.635665 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.635686 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.635715 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.635733 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:27Z","lastTransitionTime":"2025-12-03T22:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.739361 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.739443 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.739467 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.739498 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.739574 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:27Z","lastTransitionTime":"2025-12-03T22:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.843060 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.843137 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.843156 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.843185 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.843212 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:27Z","lastTransitionTime":"2025-12-03T22:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.946346 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.946408 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.946426 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.946451 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:27 crc kubenswrapper[4830]: I1203 22:06:27.946467 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:27Z","lastTransitionTime":"2025-12-03T22:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.049384 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.049426 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.049437 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.049455 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.049465 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.139497 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.139611 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.139640 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.139669 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.139691 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: E1203 22:06:28.161007 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:28Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.169063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.169122 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.169140 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.169167 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.169187 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: E1203 22:06:28.182910 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:28Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.186396 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.186430 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.186444 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.186459 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.186471 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: E1203 22:06:28.204693 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the previous attempt at 22:06:28.182910; elided] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:28Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.211402 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.211442 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.211451 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.211467 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.211476 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: E1203 22:06:28.231257 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:28Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.236152 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.236193 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.236206 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.236224 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.236237 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: E1203 22:06:28.250881 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:28Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:28 crc kubenswrapper[4830]: E1203 22:06:28.251034 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.253016 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.253053 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.253065 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.253081 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.253095 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.336558 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.336575 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.336694 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:28 crc kubenswrapper[4830]: E1203 22:06:28.336756 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:28 crc kubenswrapper[4830]: E1203 22:06:28.336707 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:28 crc kubenswrapper[4830]: E1203 22:06:28.336964 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.355805 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.355853 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.355864 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.355881 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.355894 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.459354 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.459417 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.459429 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.459452 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.459468 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.562085 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.562142 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.562158 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.562180 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.562194 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.664977 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.665043 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.665063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.665117 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.665135 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.768040 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.768090 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.768106 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.768126 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.768138 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.858944 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/2.log" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.863786 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261"} Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.864591 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.870358 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.870430 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.870457 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.870485 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.870540 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.901503 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:28Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.927600 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:28Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.943207 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:28Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.966582 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:28Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.973503 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.973948 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.974213 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.974448 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.974760 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:28Z","lastTransitionTime":"2025-12-03T22:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:28 crc kubenswrapper[4830]: I1203 22:06:28.985371 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:28Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.001454 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e93adf64-73b3-4e5c-b5b5-9f451240c88f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77eccb2ab451dfe1e6f51e4063cccb3eeca8b21e734e3a705c487216476cfb8f\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:28Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.025676 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 
22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.043314 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.067917 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.077722 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.077769 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.077782 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.077800 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.077812 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:29Z","lastTransitionTime":"2025-12-03T22:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.089067 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.120741 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:02Z\\\",\\\"message\\\":\\\":141\\\\nI1203 22:06:02.385192 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:06:02.385300 6448 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 
22:06:02.385474 6448 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.385625 6448 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386269 6448 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:02.386278 6448 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:02.386296 6448 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:02.386303 6448 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:06:02.386339 6448 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:02.386370 6448 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:02.386388 6448 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386373 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 22:06:02.386754 6448 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.140635 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.157026 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.172713 4830 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc 
kubenswrapper[4830]: I1203 22:06:29.181214 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.181284 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.181313 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.181346 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.181370 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:29Z","lastTransitionTime":"2025-12-03T22:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.192833 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183
bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.211304 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.232164 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.249475 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.272723 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"2025-12-03T22:05:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e\\\\n2025-12-03T22:05:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e to /host/opt/cni/bin/\\\\n2025-12-03T22:05:32Z [verbose] multus-daemon started\\\\n2025-12-03T22:05:32Z [verbose] Readiness Indicator file check\\\\n2025-12-03T22:06:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.284305 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.284349 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.284362 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 
22:06:29.284381 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.284394 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:29Z","lastTransitionTime":"2025-12-03T22:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.336572 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:29 crc kubenswrapper[4830]: E1203 22:06:29.336765 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.388162 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.388239 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.388257 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.388282 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.388300 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:29Z","lastTransitionTime":"2025-12-03T22:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.491554 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.491637 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.491662 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.491693 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.491715 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:29Z","lastTransitionTime":"2025-12-03T22:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.594837 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.594902 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.594920 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.594945 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.594963 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:29Z","lastTransitionTime":"2025-12-03T22:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.697709 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.697750 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.697758 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.697771 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.697780 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:29Z","lastTransitionTime":"2025-12-03T22:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.800436 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.800489 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.800500 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.800532 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.800541 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:29Z","lastTransitionTime":"2025-12-03T22:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.870292 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/3.log" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.871225 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/2.log" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.875090 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" exitCode=1 Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.875149 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261"} Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.875199 4830 scope.go:117] "RemoveContainer" containerID="0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.876351 4830 scope.go:117] "RemoveContainer" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" Dec 03 22:06:29 crc kubenswrapper[4830]: E1203 22:06:29.876676 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.902901 4830 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.902974 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.902996 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.903026 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.903049 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:29Z","lastTransitionTime":"2025-12-03T22:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.918660 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.938243 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.954266 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.970381 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.980123 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:29 crc kubenswrapper[4830]: I1203 22:06:29.993148 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e93adf64-73b3-4e5c-b5b5-9f451240c88f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77eccb2ab451dfe1e6f51e4063cccb3eeca8b21e734e3a705c487216476cfb8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:29Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.005463 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.005493 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.005522 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.005539 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.005550 4830 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:30Z","lastTransitionTime":"2025-12-03T22:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.015306 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.033823 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.051831 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.066125 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.098552 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0daa45061774072658c990b620d2f7d62c4329c1d7758df00069b84cc24dc120\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:02Z\\\",\\\"message\\\":\\\":141\\\\nI1203 22:06:02.385192 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 22:06:02.385300 6448 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 
22:06:02.385474 6448 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.385625 6448 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386269 6448 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:02.386278 6448 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:02.386296 6448 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:02.386303 6448 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 22:06:02.386339 6448 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:02.386370 6448 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:02.386388 6448 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 22:06:02.386373 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 22:06:02.386754 6448 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"203 22:06:28.689994 6805 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 22:06:28.690024 6805 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 22:06:28.690053 6805 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 22:06:28.690061 6805 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 22:06:28.690087 6805 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 22:06:28.690118 6805 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 
22:06:28.690123 6805 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:28.690135 6805 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 22:06:28.690143 6805 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 22:06:28.690150 6805 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:28.690156 6805 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:28.690167 6805 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:28.690187 6805 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:28.690166 6805 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 22:06:28.690247 6805 factory.go:656] Stopping watch factory\\\\nI1203 22:06:28.690279 6805 ovnkube.go:599] Stopped ovnkube\\\\nI1203 22:06:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":
\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktr
h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.109612 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 
22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.109655 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.109667 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.109683 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.109697 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:30Z","lastTransitionTime":"2025-12-03T22:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.121362 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.138460 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.152254 4830 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc 
kubenswrapper[4830]: I1203 22:06:30.171886 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.185452 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.199720 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.212546 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.212603 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.212616 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.212635 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.212647 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:30Z","lastTransitionTime":"2025-12-03T22:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.215791 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.230733 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"2025-12-03T22:05:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e\\\\n2025-12-03T22:05:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e to /host/opt/cni/bin/\\\\n2025-12-03T22:05:32Z [verbose] multus-daemon started\\\\n2025-12-03T22:05:32Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T22:06:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.316113 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.316310 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.316337 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.316365 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.316389 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:30Z","lastTransitionTime":"2025-12-03T22:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.336450 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.336459 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.337676 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:30 crc kubenswrapper[4830]: E1203 22:06:30.337890 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:30 crc kubenswrapper[4830]: E1203 22:06:30.338061 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:30 crc kubenswrapper[4830]: E1203 22:06:30.338200 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.419315 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.419372 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.419389 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.419412 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.419429 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:30Z","lastTransitionTime":"2025-12-03T22:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.521494 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.521545 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.521555 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.521570 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.521584 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:30Z","lastTransitionTime":"2025-12-03T22:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.624731 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.624855 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.624877 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.624901 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.624918 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:30Z","lastTransitionTime":"2025-12-03T22:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.728176 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.728260 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.728284 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.728317 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.728338 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:30Z","lastTransitionTime":"2025-12-03T22:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.831125 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.831202 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.831214 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.831233 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.831245 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:30Z","lastTransitionTime":"2025-12-03T22:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.882303 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/3.log" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.886629 4830 scope.go:117] "RemoveContainer" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" Dec 03 22:06:30 crc kubenswrapper[4830]: E1203 22:06:30.886923 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.906894 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.926656 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.935049 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.935128 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.935153 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.935188 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.935211 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:30Z","lastTransitionTime":"2025-12-03T22:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.946356 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.961298 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:30 crc kubenswrapper[4830]: I1203 22:06:30.981301 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"2025-12-03T22:05:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e\\\\n2025-12-03T22:05:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e to /host/opt/cni/bin/\\\\n2025-12-03T22:05:32Z [verbose] multus-daemon started\\\\n2025-12-03T22:05:32Z [verbose] Readiness Indicator file check\\\\n2025-12-03T22:06:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:30Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.018757 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.032337 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.037801 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.037902 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.037922 4830 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.037948 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.037967 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:31Z","lastTransitionTime":"2025-12-03T22:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.052426 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400
b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefaaedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.071417 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b
0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:
36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.084593 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.099321 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e93adf64-73b3-4e5c-b5b5-9f451240c88f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77eccb2ab451dfe1e6f51e4063cccb3eeca8b21e734e3a705c487216476cfb8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc 
kubenswrapper[4830]: I1203 22:06:31.119273 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891
acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.136308 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.140330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.140365 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.140374 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:31 crc 
kubenswrapper[4830]: I1203 22:06:31.140389 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.140399 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:31Z","lastTransitionTime":"2025-12-03T22:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.150273 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.163790 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.188918 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"203 22:06:28.689994 6805 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 22:06:28.690024 6805 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 22:06:28.690053 6805 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 22:06:28.690061 6805 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 22:06:28.690087 6805 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 22:06:28.690118 6805 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 22:06:28.690123 6805 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:28.690135 6805 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 22:06:28.690143 6805 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 22:06:28.690150 6805 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:28.690156 6805 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:28.690167 6805 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:28.690187 6805 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:28.690166 6805 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 22:06:28.690247 6805 factory.go:656] Stopping watch factory\\\\nI1203 22:06:28.690279 6805 ovnkube.go:599] Stopped ovnkube\\\\nI1203 22:06:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.204887 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.218033 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.230579 4830 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc 
kubenswrapper[4830]: I1203 22:06:31.242358 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.242409 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.242429 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.242451 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.242467 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:31Z","lastTransitionTime":"2025-12-03T22:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.336908 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:31 crc kubenswrapper[4830]: E1203 22:06:31.337149 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.346345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.346430 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.346459 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.346494 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.346545 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:31Z","lastTransitionTime":"2025-12-03T22:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.354550 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc 
kubenswrapper[4830]: I1203 22:06:31.377935 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.395886 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.418366 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a
8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.435184 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.449235 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.449285 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.449307 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.449336 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.449356 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:31Z","lastTransitionTime":"2025-12-03T22:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.457792 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"2025-12-03T22:05:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e\\\\n2025-12-03T22:05:32+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e to /host/opt/cni/bin/\\\\n2025-12-03T22:05:32Z [verbose] multus-daemon started\\\\n2025-12-03T22:05:32Z [verbose] Readiness Indicator file check\\\\n2025-12-03T22:06:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.479132 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.498693 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.512756 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.534118 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.548111 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.552430 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.552546 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.552565 4830 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.552589 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.552610 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:31Z","lastTransitionTime":"2025-12-03T22:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.566047 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.583730 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.602498 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.634359 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"203 22:06:28.689994 6805 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 22:06:28.690024 6805 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 22:06:28.690053 6805 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 22:06:28.690061 6805 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 22:06:28.690087 6805 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 22:06:28.690118 6805 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 22:06:28.690123 6805 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:28.690135 6805 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 22:06:28.690143 6805 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 22:06:28.690150 6805 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:28.690156 6805 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:28.690167 6805 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:28.690187 6805 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:28.690166 6805 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 22:06:28.690247 6805 factory.go:656] Stopping watch factory\\\\nI1203 22:06:28.690279 6805 ovnkube.go:599] Stopped ovnkube\\\\nI1203 22:06:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.656009 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.656055 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.656072 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.656096 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.656112 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:31Z","lastTransitionTime":"2025-12-03T22:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.658225 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.671814 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.687373 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e93adf64-73b3-4e5c-b5b5-9f451240c88f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77eccb2ab451dfe1e6f51e4063cccb3eeca8b21e734e3a705c487216476cfb8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc 
kubenswrapper[4830]: I1203 22:06:31.709538 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891
acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 
22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:31Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.758778 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.758838 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.758855 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.758881 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.758898 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:31Z","lastTransitionTime":"2025-12-03T22:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.862050 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.862108 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.862175 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.862209 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.862226 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:31Z","lastTransitionTime":"2025-12-03T22:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.966292 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.966358 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.966375 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.966403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:31 crc kubenswrapper[4830]: I1203 22:06:31.966419 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:31Z","lastTransitionTime":"2025-12-03T22:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.068664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.068710 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.068746 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.068763 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.068774 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:32Z","lastTransitionTime":"2025-12-03T22:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.171234 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.171726 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.171784 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.171816 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.171833 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:32Z","lastTransitionTime":"2025-12-03T22:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.275249 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.275819 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.275951 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.276101 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.276239 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:32Z","lastTransitionTime":"2025-12-03T22:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.336282 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.336380 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.336380 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:32 crc kubenswrapper[4830]: E1203 22:06:32.337465 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:32 crc kubenswrapper[4830]: E1203 22:06:32.337688 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:32 crc kubenswrapper[4830]: E1203 22:06:32.337772 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.380086 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.380161 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.380187 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.380217 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.380241 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:32Z","lastTransitionTime":"2025-12-03T22:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.483922 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.484030 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.484046 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.484095 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.484114 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:32Z","lastTransitionTime":"2025-12-03T22:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.587795 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.587855 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.587873 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.587898 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.587915 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:32Z","lastTransitionTime":"2025-12-03T22:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.690764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.690825 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.690842 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.690865 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.690882 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:32Z","lastTransitionTime":"2025-12-03T22:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.794212 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.794287 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.794311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.794341 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.794363 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:32Z","lastTransitionTime":"2025-12-03T22:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.896541 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.896607 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.896628 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.896656 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.896679 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:32Z","lastTransitionTime":"2025-12-03T22:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.999480 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.999575 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.999596 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.999623 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:32 crc kubenswrapper[4830]: I1203 22:06:32.999644 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:32Z","lastTransitionTime":"2025-12-03T22:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.102820 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.102882 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.102902 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.102930 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.102949 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:33Z","lastTransitionTime":"2025-12-03T22:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.205920 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.205992 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.206009 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.206035 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.206053 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:33Z","lastTransitionTime":"2025-12-03T22:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.309960 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.310063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.310083 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.310138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.310159 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:33Z","lastTransitionTime":"2025-12-03T22:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.336078 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr"
Dec 03 22:06:33 crc kubenswrapper[4830]: E1203 22:06:33.336297 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.412647 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.412710 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.412727 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.412750 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.412769 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:33Z","lastTransitionTime":"2025-12-03T22:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.516659 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.517056 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.517199 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.517493 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.517705 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:33Z","lastTransitionTime":"2025-12-03T22:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.621697 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.621756 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.621774 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.621799 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.621887 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:33Z","lastTransitionTime":"2025-12-03T22:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.725976 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.726042 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.726061 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.726088 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.726108 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:33Z","lastTransitionTime":"2025-12-03T22:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.830026 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.830097 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.830116 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.830146 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.830170 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:33Z","lastTransitionTime":"2025-12-03T22:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.933214 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.933280 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.933301 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.933325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:33 crc kubenswrapper[4830]: I1203 22:06:33.933343 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:33Z","lastTransitionTime":"2025-12-03T22:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.037333 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.037407 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.037432 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.037461 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.037484 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:34Z","lastTransitionTime":"2025-12-03T22:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.140873 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.140938 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.140955 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.140978 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.140995 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:34Z","lastTransitionTime":"2025-12-03T22:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.244529 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.244574 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.244583 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.244597 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.244607 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:34Z","lastTransitionTime":"2025-12-03T22:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.276186 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276395 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.276334779 +0000 UTC m=+147.272796198 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.276461 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.276602 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.276664 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276736 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.276749 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276754 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276814 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276841 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276855 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276868 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276877 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.276863343 +0000 UTC m=+147.273324712 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276775 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276898 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276929 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.276904964 +0000 UTC m=+147.273366353 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276963 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.276946595 +0000 UTC m=+147.273407984 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.276992 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.276977636 +0000 UTC m=+147.273439025 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.336759 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.336852 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.336913 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.336764 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.337001 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 22:06:34 crc kubenswrapper[4830]: E1203 22:06:34.337142 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.348052 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.348125 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.348142 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.348161 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.348176 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:34Z","lastTransitionTime":"2025-12-03T22:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.451458 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.451562 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.451581 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.451606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.451623 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:34Z","lastTransitionTime":"2025-12-03T22:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.555567 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.555634 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.555659 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.555691 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.555716 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:34Z","lastTransitionTime":"2025-12-03T22:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.658744 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.658849 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.658871 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.658895 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.658912 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:34Z","lastTransitionTime":"2025-12-03T22:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.761074 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.761141 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.761202 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.761230 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.761248 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:34Z","lastTransitionTime":"2025-12-03T22:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.864826 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.864908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.864927 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.864950 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.864963 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:34Z","lastTransitionTime":"2025-12-03T22:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.967490 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.967580 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.967599 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.967623 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:34 crc kubenswrapper[4830]: I1203 22:06:34.967641 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:34Z","lastTransitionTime":"2025-12-03T22:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.070475 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.070542 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.070561 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.070585 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.070600 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:35Z","lastTransitionTime":"2025-12-03T22:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.174045 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.174087 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.174097 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.174112 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.174124 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:35Z","lastTransitionTime":"2025-12-03T22:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.276342 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.276390 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.276404 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.276427 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.276442 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:35Z","lastTransitionTime":"2025-12-03T22:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.336460 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:35 crc kubenswrapper[4830]: E1203 22:06:35.336738 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.379866 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.379928 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.379947 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.379972 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.379990 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:35Z","lastTransitionTime":"2025-12-03T22:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.482972 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.483018 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.483029 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.483048 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.483060 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:35Z","lastTransitionTime":"2025-12-03T22:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.586162 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.586210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.586227 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.586250 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.586266 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:35Z","lastTransitionTime":"2025-12-03T22:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.690129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.690208 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.690230 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.690260 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.690282 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:35Z","lastTransitionTime":"2025-12-03T22:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.792589 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.792653 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.792676 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.792705 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.792727 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:35Z","lastTransitionTime":"2025-12-03T22:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.895261 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.895308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.895320 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.895338 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.895348 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:35Z","lastTransitionTime":"2025-12-03T22:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.998227 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.998293 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.998318 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.998350 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:35 crc kubenswrapper[4830]: I1203 22:06:35.998371 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:35Z","lastTransitionTime":"2025-12-03T22:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.101546 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.101611 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.101634 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.101663 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.101685 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:36Z","lastTransitionTime":"2025-12-03T22:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.204725 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.204794 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.204813 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.204837 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.204854 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:36Z","lastTransitionTime":"2025-12-03T22:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.307699 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.307761 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.307780 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.307807 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.307825 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:36Z","lastTransitionTime":"2025-12-03T22:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.336691 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.336781 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:36 crc kubenswrapper[4830]: E1203 22:06:36.336848 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:36 crc kubenswrapper[4830]: E1203 22:06:36.336999 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.336781 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:36 crc kubenswrapper[4830]: E1203 22:06:36.337189 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.410476 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.410586 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.410611 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.410642 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.410665 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:36Z","lastTransitionTime":"2025-12-03T22:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.513138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.513203 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.513223 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.513247 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.513266 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:36Z","lastTransitionTime":"2025-12-03T22:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.615720 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.615787 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.615812 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.615838 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.615858 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:36Z","lastTransitionTime":"2025-12-03T22:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.719238 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.719286 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.719301 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.719321 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.719335 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:36Z","lastTransitionTime":"2025-12-03T22:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.822254 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.822325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.822350 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.822379 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.822400 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:36Z","lastTransitionTime":"2025-12-03T22:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.925180 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.925237 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.925252 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.925276 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:36 crc kubenswrapper[4830]: I1203 22:06:36.925293 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:36Z","lastTransitionTime":"2025-12-03T22:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.028473 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.028563 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.028580 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.028605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.028622 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:37Z","lastTransitionTime":"2025-12-03T22:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.131990 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.132108 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.132128 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.132155 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.132173 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:37Z","lastTransitionTime":"2025-12-03T22:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.235284 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.235335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.235356 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.235386 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.235409 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:37Z","lastTransitionTime":"2025-12-03T22:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.336899 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:37 crc kubenswrapper[4830]: E1203 22:06:37.337163 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.338907 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.338950 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.338968 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.338988 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.339004 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:37Z","lastTransitionTime":"2025-12-03T22:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.442589 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.442843 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.442864 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.442891 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.442913 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:37Z","lastTransitionTime":"2025-12-03T22:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.546169 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.546206 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.546219 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.546236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.546247 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:37Z","lastTransitionTime":"2025-12-03T22:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.649350 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.649394 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.649406 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.649422 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.649434 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:37Z","lastTransitionTime":"2025-12-03T22:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.751489 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.751529 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.751539 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.751552 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.751561 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:37Z","lastTransitionTime":"2025-12-03T22:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.854137 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.854183 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.854199 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.854218 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.854232 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:37Z","lastTransitionTime":"2025-12-03T22:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.958619 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.958775 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.958800 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.958872 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:37 crc kubenswrapper[4830]: I1203 22:06:37.958898 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:37Z","lastTransitionTime":"2025-12-03T22:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.061782 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.061843 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.061867 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.061899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.061922 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.164941 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.165347 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.165478 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.165656 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.165791 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.269451 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.269534 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.269553 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.269576 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.269592 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.336605 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.336654 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.336693 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:38 crc kubenswrapper[4830]: E1203 22:06:38.336770 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:38 crc kubenswrapper[4830]: E1203 22:06:38.336870 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:38 crc kubenswrapper[4830]: E1203 22:06:38.337359 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.372290 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.372345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.372362 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.372387 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.372406 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.475122 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.475161 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.475169 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.475183 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.475193 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.527059 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.527110 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.527127 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.527144 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.527156 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: E1203 22:06:38.546618 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.556554 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.556890 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.557095 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.557287 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.557469 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: E1203 22:06:38.579476 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.585135 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.585191 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.585209 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.585232 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.585249 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: E1203 22:06:38.605022 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.610383 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.610803 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.610897 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.610978 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.611051 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: E1203 22:06:38.629910 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.636551 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.636591 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.636603 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.636618 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.636629 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: E1203 22:06:38.655863 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:38Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:38 crc kubenswrapper[4830]: E1203 22:06:38.656180 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.658683 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.658872 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.659005 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.659127 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.659238 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.761764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.761828 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.761847 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.761872 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.761890 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.864208 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.864544 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.864622 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.864694 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.864777 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.967957 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.968220 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.968300 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.968416 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:38 crc kubenswrapper[4830]: I1203 22:06:38.968546 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:38Z","lastTransitionTime":"2025-12-03T22:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.071226 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.071264 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.071273 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.071289 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.071298 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:39Z","lastTransitionTime":"2025-12-03T22:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.175138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.175185 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.175196 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.175216 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.175234 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:39Z","lastTransitionTime":"2025-12-03T22:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.278019 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.278088 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.278107 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.278136 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.278154 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:39Z","lastTransitionTime":"2025-12-03T22:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.336663 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:39 crc kubenswrapper[4830]: E1203 22:06:39.336857 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.380268 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.380324 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.380347 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.380376 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.380402 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:39Z","lastTransitionTime":"2025-12-03T22:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.483944 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.483984 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.483993 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.484008 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.484018 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:39Z","lastTransitionTime":"2025-12-03T22:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.587653 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.587731 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.587755 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.587788 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.587814 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:39Z","lastTransitionTime":"2025-12-03T22:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.691073 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.691134 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.691151 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.691176 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.691193 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:39Z","lastTransitionTime":"2025-12-03T22:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.794479 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.794562 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.794583 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.794605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.794622 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:39Z","lastTransitionTime":"2025-12-03T22:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.898342 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.898764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.898910 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.899033 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:39 crc kubenswrapper[4830]: I1203 22:06:39.899148 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:39Z","lastTransitionTime":"2025-12-03T22:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.003307 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.003377 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.003400 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.003428 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.003451 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:40Z","lastTransitionTime":"2025-12-03T22:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.105899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.105963 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.105984 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.106018 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.106043 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:40Z","lastTransitionTime":"2025-12-03T22:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.210168 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.210239 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.210257 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.210281 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.210298 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:40Z","lastTransitionTime":"2025-12-03T22:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.314427 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.314581 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.314600 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.314658 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.314678 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:40Z","lastTransitionTime":"2025-12-03T22:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.336296 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.336357 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:40 crc kubenswrapper[4830]: E1203 22:06:40.336450 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.336366 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:40 crc kubenswrapper[4830]: E1203 22:06:40.336573 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:40 crc kubenswrapper[4830]: E1203 22:06:40.336714 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.417744 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.417811 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.417832 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.417858 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.417880 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:40Z","lastTransitionTime":"2025-12-03T22:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.520360 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.520400 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.520408 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.520423 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.520432 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:40Z","lastTransitionTime":"2025-12-03T22:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.624087 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.624199 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.624218 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.624290 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.624315 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:40Z","lastTransitionTime":"2025-12-03T22:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.730725 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.730789 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.730800 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.730816 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.730833 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:40Z","lastTransitionTime":"2025-12-03T22:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.834034 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.834129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.834154 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.834183 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.834203 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:40Z","lastTransitionTime":"2025-12-03T22:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.936943 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.937029 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.937050 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.937078 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:40 crc kubenswrapper[4830]: I1203 22:06:40.937105 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:40Z","lastTransitionTime":"2025-12-03T22:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.040403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.040563 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.040604 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.040641 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.040666 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:41Z","lastTransitionTime":"2025-12-03T22:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.143689 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.143772 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.143808 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.143841 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.143864 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:41Z","lastTransitionTime":"2025-12-03T22:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.246658 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.246720 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.246742 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.246766 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.246783 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:41Z","lastTransitionTime":"2025-12-03T22:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.336365 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:41 crc kubenswrapper[4830]: E1203 22:06:41.336616 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.349344 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.349407 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.349426 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.349452 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.349469 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:41Z","lastTransitionTime":"2025-12-03T22:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.371124 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.391227 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.408954 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.429894 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.451573 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.451624 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.451641 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.451674 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.451690 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:41Z","lastTransitionTime":"2025-12-03T22:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.452493 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.484701 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"203 22:06:28.689994 6805 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 22:06:28.690024 6805 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 22:06:28.690053 6805 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 22:06:28.690061 6805 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 22:06:28.690087 6805 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 22:06:28.690118 6805 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 22:06:28.690123 6805 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:28.690135 6805 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 22:06:28.690143 6805 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 22:06:28.690150 6805 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:28.690156 6805 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:28.690167 6805 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:28.690187 6805 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:28.690166 6805 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 22:06:28.690247 6805 factory.go:656] Stopping watch factory\\\\nI1203 22:06:28.690279 6805 ovnkube.go:599] Stopped ovnkube\\\\nI1203 22:06:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.505223 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.520965 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.537308 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e93adf64-73b3-4e5c-b5b5-9f451240c88f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77eccb2ab451dfe1e6f51e4063cccb3eeca8b21e734e3a705c487216476cfb8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.555935 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.555996 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.556014 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.556039 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.556056 4830 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:41Z","lastTransitionTime":"2025-12-03T22:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.556170 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.576943 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.599120 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.617682 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.633169 4830 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc 
kubenswrapper[4830]: I1203 22:06:41.647543 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.658592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.658635 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.658653 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.658678 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.658698 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:41Z","lastTransitionTime":"2025-12-03T22:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.666586 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"2025-12-03T22:05:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e\\\\n2025-12-03T22:05:32+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e to /host/opt/cni/bin/\\\\n2025-12-03T22:05:32Z [verbose] multus-daemon started\\\\n2025-12-03T22:05:32Z [verbose] Readiness Indicator file check\\\\n2025-12-03T22:06:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.684594 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.703002 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.725734 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:41Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.762299 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.762347 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.762371 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.762400 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.762422 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:41Z","lastTransitionTime":"2025-12-03T22:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.865016 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.865458 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.865651 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.865804 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.865933 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:41Z","lastTransitionTime":"2025-12-03T22:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.969490 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.969584 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.969604 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.969627 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:41 crc kubenswrapper[4830]: I1203 22:06:41.969645 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:41Z","lastTransitionTime":"2025-12-03T22:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.073378 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.073456 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.073480 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.073561 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.073589 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:42Z","lastTransitionTime":"2025-12-03T22:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.176857 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.177227 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.177493 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.177737 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.177906 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:42Z","lastTransitionTime":"2025-12-03T22:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.281784 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.281853 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.281870 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.281901 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.281921 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:42Z","lastTransitionTime":"2025-12-03T22:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.336736 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.336798 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.336850 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 22:06:42 crc kubenswrapper[4830]: E1203 22:06:42.337767 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 22:06:42 crc kubenswrapper[4830]: E1203 22:06:42.337859 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 22:06:42 crc kubenswrapper[4830]: E1203 22:06:42.337960 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.338296 4830 scope.go:117] "RemoveContainer" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261"
Dec 03 22:06:42 crc kubenswrapper[4830]: E1203 22:06:42.338676 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.384445 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.384503 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.384589 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.384617 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.384637 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:42Z","lastTransitionTime":"2025-12-03T22:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.487809 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.487884 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.487908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.487938 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.487958 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:42Z","lastTransitionTime":"2025-12-03T22:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.590794 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.590889 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.590910 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.590936 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.590957 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:42Z","lastTransitionTime":"2025-12-03T22:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.693161 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.693221 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.693239 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.693262 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.693279 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:42Z","lastTransitionTime":"2025-12-03T22:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.795949 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.796026 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.796047 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.796071 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.796090 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:42Z","lastTransitionTime":"2025-12-03T22:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.898558 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.898623 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.898645 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.898674 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:42 crc kubenswrapper[4830]: I1203 22:06:42.898697 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:42Z","lastTransitionTime":"2025-12-03T22:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.001696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.001749 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.001762 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.001781 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.001794 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:43Z","lastTransitionTime":"2025-12-03T22:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.103800 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.103866 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.103886 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.103911 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.103929 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:43Z","lastTransitionTime":"2025-12-03T22:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.207241 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.207311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.207330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.207357 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.207373 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:43Z","lastTransitionTime":"2025-12-03T22:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.310275 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.310316 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.310329 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.310345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.310357 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:43Z","lastTransitionTime":"2025-12-03T22:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.336698 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr"
Dec 03 22:06:43 crc kubenswrapper[4830]: E1203 22:06:43.336924 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.413703 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.413761 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.413774 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.413797 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.413806 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:43Z","lastTransitionTime":"2025-12-03T22:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.517087 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.517158 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.517176 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.517204 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.517224 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:43Z","lastTransitionTime":"2025-12-03T22:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.620934 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.621116 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.621155 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.621249 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.621306 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:43Z","lastTransitionTime":"2025-12-03T22:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.725435 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.725495 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.725544 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.725573 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.725592 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:43Z","lastTransitionTime":"2025-12-03T22:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.829638 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.829699 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.829714 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.829735 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.829754 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:43Z","lastTransitionTime":"2025-12-03T22:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.933033 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.933106 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.933118 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.933136 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:43 crc kubenswrapper[4830]: I1203 22:06:43.933147 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:43Z","lastTransitionTime":"2025-12-03T22:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.035997 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.036072 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.036089 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.036117 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.036134 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:44Z","lastTransitionTime":"2025-12-03T22:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.139503 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.139606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.139623 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.139647 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.139668 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:44Z","lastTransitionTime":"2025-12-03T22:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.242029 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.242092 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.242111 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.242134 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.242164 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:44Z","lastTransitionTime":"2025-12-03T22:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.336960 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.337015 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.336960 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 22:06:44 crc kubenswrapper[4830]: E1203 22:06:44.337188 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 22:06:44 crc kubenswrapper[4830]: E1203 22:06:44.337352 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 22:06:44 crc kubenswrapper[4830]: E1203 22:06:44.337416 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.345203 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.345260 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.345278 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.345301 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.345318 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:44Z","lastTransitionTime":"2025-12-03T22:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.447966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.448041 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.448060 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.448094 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.448117 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:44Z","lastTransitionTime":"2025-12-03T22:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.551138 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.551213 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.551232 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.551258 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.551291 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:44Z","lastTransitionTime":"2025-12-03T22:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.654623 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.654682 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.654700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.654723 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.654740 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:44Z","lastTransitionTime":"2025-12-03T22:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.757504 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.757603 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.757620 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.757643 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.757663 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:44Z","lastTransitionTime":"2025-12-03T22:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.860623 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.860693 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.860717 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.860745 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.860766 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:44Z","lastTransitionTime":"2025-12-03T22:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.964068 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.964163 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.964181 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.964201 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:44 crc kubenswrapper[4830]: I1203 22:06:44.964218 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:44Z","lastTransitionTime":"2025-12-03T22:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.067579 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.067638 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.067655 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.067681 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.067699 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:45Z","lastTransitionTime":"2025-12-03T22:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.171328 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.171380 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.171395 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.171414 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.171427 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:45Z","lastTransitionTime":"2025-12-03T22:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.274818 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.274877 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.274895 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.274921 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.274937 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:45Z","lastTransitionTime":"2025-12-03T22:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.336742 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:45 crc kubenswrapper[4830]: E1203 22:06:45.336911 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.377825 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.377907 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.377940 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.377959 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.377972 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:45Z","lastTransitionTime":"2025-12-03T22:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.481059 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.481109 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.481129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.481153 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.481172 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:45Z","lastTransitionTime":"2025-12-03T22:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.584590 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.584719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.584740 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.584764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.584782 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:45Z","lastTransitionTime":"2025-12-03T22:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.688120 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.688213 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.688230 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.688254 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.688271 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:45Z","lastTransitionTime":"2025-12-03T22:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.791855 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.791917 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.791935 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.791960 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.791978 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:45Z","lastTransitionTime":"2025-12-03T22:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.894898 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.894969 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.894982 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.895024 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.895038 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:45Z","lastTransitionTime":"2025-12-03T22:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.997916 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.997971 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.997986 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.998003 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:45 crc kubenswrapper[4830]: I1203 22:06:45.998017 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:45Z","lastTransitionTime":"2025-12-03T22:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.100521 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.100785 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.100794 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.100807 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.100818 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:46Z","lastTransitionTime":"2025-12-03T22:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.203658 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.203717 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.203742 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.203771 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.203791 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:46Z","lastTransitionTime":"2025-12-03T22:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.306868 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.306902 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.306910 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.306923 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.306933 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:46Z","lastTransitionTime":"2025-12-03T22:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.336629 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.336695 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:46 crc kubenswrapper[4830]: E1203 22:06:46.336825 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.336876 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:46 crc kubenswrapper[4830]: E1203 22:06:46.337181 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:46 crc kubenswrapper[4830]: E1203 22:06:46.337356 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.410078 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.410132 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.410148 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.410172 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.410189 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:46Z","lastTransitionTime":"2025-12-03T22:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.512782 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.512867 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.512883 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.512900 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.512913 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:46Z","lastTransitionTime":"2025-12-03T22:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.615113 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.615166 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.615197 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.615211 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.615222 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:46Z","lastTransitionTime":"2025-12-03T22:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.718302 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.718366 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.718376 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.718439 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.718453 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:46Z","lastTransitionTime":"2025-12-03T22:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.822021 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.822087 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.822107 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.822131 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.822148 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:46Z","lastTransitionTime":"2025-12-03T22:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.925480 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.925539 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.925549 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.925567 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:46 crc kubenswrapper[4830]: I1203 22:06:46.925581 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:46Z","lastTransitionTime":"2025-12-03T22:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.028995 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.029070 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.029092 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.029116 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.029136 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:47Z","lastTransitionTime":"2025-12-03T22:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.132676 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.132721 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.132735 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.132759 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.132776 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:47Z","lastTransitionTime":"2025-12-03T22:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.235975 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.236042 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.236053 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.236076 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.236090 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:47Z","lastTransitionTime":"2025-12-03T22:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.336285 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:47 crc kubenswrapper[4830]: E1203 22:06:47.336560 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.339178 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.339237 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.339253 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.339276 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.339294 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:47Z","lastTransitionTime":"2025-12-03T22:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.443120 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.443191 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.443210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.443236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.443254 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:47Z","lastTransitionTime":"2025-12-03T22:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.545759 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.545823 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.545845 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.545875 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.545897 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:47Z","lastTransitionTime":"2025-12-03T22:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.649685 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.649744 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.649765 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.649799 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.649818 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:47Z","lastTransitionTime":"2025-12-03T22:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.753753 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.753816 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.753832 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.753854 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.753871 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:47Z","lastTransitionTime":"2025-12-03T22:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.856693 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.856742 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.856754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.856771 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.856783 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:47Z","lastTransitionTime":"2025-12-03T22:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.959342 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.959389 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.959401 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.959419 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:47 crc kubenswrapper[4830]: I1203 22:06:47.959434 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:47Z","lastTransitionTime":"2025-12-03T22:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.062979 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.063044 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.063063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.063089 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.063107 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.165700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.165749 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.165762 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.165779 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.165790 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.268577 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.268625 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.268650 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.268678 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.268699 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.335972 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:48 crc kubenswrapper[4830]: E1203 22:06:48.336093 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.336250 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:48 crc kubenswrapper[4830]: E1203 22:06:48.336290 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.336377 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:48 crc kubenswrapper[4830]: E1203 22:06:48.336424 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.371207 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.371251 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.371264 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.371279 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.371291 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.474916 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.474958 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.474967 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.474983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.474991 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.577960 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.578020 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.578039 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.578075 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.578093 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.664211 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.664269 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.664288 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.664311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.664328 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: E1203 22:06:48.687025 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.692139 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.692198 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.692218 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.692243 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.692261 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: E1203 22:06:48.713881 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.718700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.718745 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.718764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.718787 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.718804 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: E1203 22:06:48.738887 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.743966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.744033 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.744057 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.744089 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.744117 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: E1203 22:06:48.764816 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.769724 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.769794 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.769812 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.769838 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.769856 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: E1203 22:06:48.789608 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5096e846-2f08-4706-b180-cb04a3bb9612\\\",\\\"systemUUID\\\":\\\"650ea5bb-184d-4066-8107-1bf795365c7c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:48Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:48 crc kubenswrapper[4830]: E1203 22:06:48.789842 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.791830 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.791881 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.791897 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.791921 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.791939 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.895372 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.895429 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.895446 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.895468 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.895488 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.998166 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.998262 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.998282 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.998314 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:48 crc kubenswrapper[4830]: I1203 22:06:48.998340 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:48Z","lastTransitionTime":"2025-12-03T22:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.101308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.101700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.101907 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.102134 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.102338 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:49Z","lastTransitionTime":"2025-12-03T22:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.205861 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.206116 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.206279 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.206457 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.206695 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:49Z","lastTransitionTime":"2025-12-03T22:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.246037 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:49 crc kubenswrapper[4830]: E1203 22:06:49.246296 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:06:49 crc kubenswrapper[4830]: E1203 22:06:49.246420 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs podName:211b6f37-bd3f-475e-b4d9-e3d94ae07c52 nodeName:}" failed. No retries permitted until 2025-12-03 22:07:53.246391075 +0000 UTC m=+162.242852464 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs") pod "network-metrics-daemon-zlcmr" (UID: "211b6f37-bd3f-475e-b4d9-e3d94ae07c52") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.310468 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.310605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.310634 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.310669 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.310694 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:49Z","lastTransitionTime":"2025-12-03T22:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.337233 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:49 crc kubenswrapper[4830]: E1203 22:06:49.337598 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.413578 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.413634 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.413660 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.413691 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.413714 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:49Z","lastTransitionTime":"2025-12-03T22:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.516869 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.516920 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.516936 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.516960 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.516977 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:49Z","lastTransitionTime":"2025-12-03T22:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.620171 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.620267 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.620307 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.620346 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.620368 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:49Z","lastTransitionTime":"2025-12-03T22:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.726390 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.726445 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.726499 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.726554 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.726573 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:49Z","lastTransitionTime":"2025-12-03T22:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.829417 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.829487 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.829496 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.829547 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.829561 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:49Z","lastTransitionTime":"2025-12-03T22:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.932220 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.932260 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.932270 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.932284 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:49 crc kubenswrapper[4830]: I1203 22:06:49.932293 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:49Z","lastTransitionTime":"2025-12-03T22:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.034275 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.034329 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.034345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.034367 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.034383 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:50Z","lastTransitionTime":"2025-12-03T22:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.137721 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.137774 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.137791 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.137818 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.137837 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:50Z","lastTransitionTime":"2025-12-03T22:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.241065 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.241311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.241366 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.241396 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.241419 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:50Z","lastTransitionTime":"2025-12-03T22:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.336910 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.336956 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:50 crc kubenswrapper[4830]: E1203 22:06:50.337232 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.337291 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:50 crc kubenswrapper[4830]: E1203 22:06:50.337364 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:50 crc kubenswrapper[4830]: E1203 22:06:50.337651 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.344546 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.344596 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.344619 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.344643 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.344664 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:50Z","lastTransitionTime":"2025-12-03T22:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.447331 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.447370 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.447403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.447421 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.447430 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:50Z","lastTransitionTime":"2025-12-03T22:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.549984 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.550066 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.550082 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.550101 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.550113 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:50Z","lastTransitionTime":"2025-12-03T22:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.653073 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.653165 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.653183 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.653209 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.653257 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:50Z","lastTransitionTime":"2025-12-03T22:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.757086 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.757147 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.757165 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.757187 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.757205 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:50Z","lastTransitionTime":"2025-12-03T22:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.860257 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.860330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.860351 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.860378 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.860397 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:50Z","lastTransitionTime":"2025-12-03T22:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.963012 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.963073 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.963091 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.963116 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:50 crc kubenswrapper[4830]: I1203 22:06:50.963135 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:50Z","lastTransitionTime":"2025-12-03T22:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.066249 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.066334 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.066363 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.066394 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.066418 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:51Z","lastTransitionTime":"2025-12-03T22:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.169827 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.169899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.169921 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.169950 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.169968 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:51Z","lastTransitionTime":"2025-12-03T22:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.274070 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.274149 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.274167 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.274195 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.274213 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:51Z","lastTransitionTime":"2025-12-03T22:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.336767 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:51 crc kubenswrapper[4830]: E1203 22:06:51.336944 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.362453 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.377606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.377668 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.377685 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.377709 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.377728 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:51Z","lastTransitionTime":"2025-12-03T22:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.382161 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d554a8124083268a7fbaf48c99be2a6b4b303da5dd71e1355b6a255c3c91c970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.418203 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44a18320-6162-4fc5-a89c-363c4c6cd030\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:28Z\\\",\\\"message\\\":\\\"203 22:06:28.689994 6805 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 22:06:28.690024 6805 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 22:06:28.690053 6805 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 22:06:28.690061 6805 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 22:06:28.690087 6805 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 22:06:28.690118 6805 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 22:06:28.690123 6805 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 22:06:28.690135 6805 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 22:06:28.690143 6805 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 22:06:28.690150 6805 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 22:06:28.690156 6805 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 22:06:28.690167 6805 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 22:06:28.690187 6805 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 22:06:28.690166 6805 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 22:06:28.690247 6805 factory.go:656] Stopping watch factory\\\\nI1203 22:06:28.690279 6805 ovnkube.go:599] Stopped ovnkube\\\\nI1203 22:06:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:06:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742a1fe796ef0d14a1
fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sktrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5vgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.446865 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"080247dd-b7ea-44e0-9145-da0eeade0107\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23db9f9ef5c3cbf25f40d85bcd17ee0c8605b31a8781b6bfe390322f7141f1ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4019e12aeb644bdbfb19aa2a7afeeff81095b142536ab28e49402ca4e75a13b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0c80ee09611dcdbaee8a09e036e2944a666b5a6dd6e8648a841c586de411856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef4a0b95f8089d523a37563e42a5d2e503f5b74c05bb5323ff1dbc18625127e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://459f6
bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://459f6bcca66a5a5bfde1c0b56edc0aab8ee2313381e3d692a3f6da06aa58a364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2eef1c474f977a0066c1c26f10c8d25cc7d6281a2f5e00a264c5b0e442b8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7317cfd97e5fe64f0d0b79199e12d1aef74b81e290a06daa68eb86bfdcac6de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wtl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wdcn6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.485604 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.485658 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.485670 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.485686 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.485697 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:51Z","lastTransitionTime":"2025-12-03T22:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.494796 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wzd28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b81e3c78-e222-410b-8cca-a4ba48f72f87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67a1f4a3ff797e28b08a2db72e347f5d7ac5a0e84a6a6db5a8027289aa63cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwtbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wzd28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.515193 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e93adf64-73b3-4e5c-b5b5-9f451240c88f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77eccb2ab451dfe1e6f51e4063cccb3eeca8b21e734e3a705c487216476cfb8f\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89c1f0001e07c26b9ffe8fdc8b5f74c9b8b42abd2e9df52158e53af21224e03b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.531889 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91f7641c-69eb-4471-b294-ed60f8362d7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW1203 22:05:29.946787 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1203 22:05:29.946937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 22:05:29.948832 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1876776915/tls.crt::/tmp/serving-cert-1876776915/tls.key\\\\\\\"\\\\nI1203 22:05:30.417600 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 22:05:30.427800 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 22:05:30.427831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 22:05:30.427852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 22:05:30.427876 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 22:05:30.433082 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 22:05:30.433117 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433122 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 22:05:30.433127 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 
22:05:30.433131 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 22:05:30.433135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 22:05:30.433138 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 22:05:30.433163 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 22:05:30.445438 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.546619 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.564358 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a363f1fa67c00c56f6d94192093de3d2fd472db75ff9c00021d44ec63bedcea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.576629 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19b7b77-9efe-4ebf-b9a4-f6253923cbc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e790242573d268c7999fc1cd8c4d57b167f26cc16035679abac21da503810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7befe89f78c054752d772feaa1eb48cc15fa2968f9c56b5f1a9da9dc16a6e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4dc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9tz2v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.587840 4830 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.587885 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.587899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.587916 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.587928 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:51Z","lastTransitionTime":"2025-12-03T22:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.592151 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bggws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zlcmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc 
kubenswrapper[4830]: I1203 22:06:51.607628 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pr4cr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2b0660f-4a2e-4d96-829a-fc54cbf92f0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d593ab7e7f767a6640899b01b3f3e00e20b63baee54888de08c9f8c40d5bfc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qznjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pr4cr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.622640 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sh485" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdccedf8-f580-49f0-848e-108c748d8a21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e1
2f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T22:06:17Z\\\",\\\"message\\\":\\\"2025-12-03T22:05:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e\\\\n2025-12-03T22:05:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3c05ef7d-1cce-4c1c-822c-124b7f51f84e to /host/opt/cni/bin/\\\\n2025-12-03T22:05:32Z [verbose] multus-daemon started\\\\n2025-12-03T22:05:32Z [verbose] Readiness Indicator file check\\\\n2025-12-03T22:06:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blsqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sh485\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.636359 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be992564-5ce8-4a23-b65a-2661fc3c332c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c8c1506447f5f0b55108a95438f8c1effabf9515cb4480bfad7a92f9a823a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4737d7e165ac7627c4964f199eb057496ad84c2149faa0de396b60ea6f7a184b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6f51f2d51d4da8df7b75ba949087796150afc378d059d40816391dc27877bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.647584 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51912499-b1e0-4923-9a74-5a34ddb74566\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6387d54218aec7acae636fa2ca2a5f9fca9adbcc29a79b2adcf1d1a676829909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2891347ccb0ff3503c2db3c5a848048d855db0e6c8dcd7ef3871e280749fb1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://335a31fe8956a343444d7d7d75ad38385556ad3b35fde9defb22aa6192ee079c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9c86f927cc05b1a64b0d1ded4bd14efb900e83f73935a53f577a1460bf1592fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.668364 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35b731ede2d1bc2bb5d0ff0eadec8d5ae63ed603e11d41da42711486ca43a49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa0662b0e2ca413c387a8d517afa0ec23b1e93eb78c0394698c87910b750f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.689262 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dfabca54-3c7a-4d2f-9cc3-b56973c94b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d8d94615fa5b7d93417526d5262f9b4d94093a8b23f7c686596150494de853e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8410e9bc7eb13df8f79aa63929fe8fc96fbe6f8f144748e84d3169d5b299fc3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13f7255cea99b35c25e86e2780bdece7a20b2d88cc86fe3b632b8ad748b1235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceebadcd3bc4b4c4d8bb59de2287ddc833f878885c30bcf26dfc8fb3c64fb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://943c9b15977cc0a2677612a5848722253b59dd7efcb78a12fa360a90a77a55b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f748b1d12c3567416a9be897923629484d3f139b0e908a0a2451a24b88da494e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T22:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df6bb0c2f993d2adb1770e2887e2f3cb9e8555e4646763b86d12cc84f432432c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ee6ba013f7f1bc485d995783e9d5c7d58a9bf56ee4608c013c907e0ca27e0fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T22:05:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T22:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.691101 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.691126 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.691134 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.691146 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.691155 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:51Z","lastTransitionTime":"2025-12-03T22:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.709971 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.729721 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T22:05:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed355dbd400b9a6a95d99a2b1190e27f0f0cf2cce93b0762ee7d6b37805ee18c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d384acee36d352984805a1fbebe07735a2cccefa
aedfc389a65a023cd6463f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4kbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T22:05:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nfl7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T22:06:51Z is after 2025-08-24T17:21:41Z" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.794132 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.794194 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.794211 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:51 crc 
kubenswrapper[4830]: I1203 22:06:51.794237 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.794256 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:51Z","lastTransitionTime":"2025-12-03T22:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.897370 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.897451 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.897474 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.897537 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:51 crc kubenswrapper[4830]: I1203 22:06:51.897563 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:51Z","lastTransitionTime":"2025-12-03T22:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.000688 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.000777 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.000793 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.000818 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.000836 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:52Z","lastTransitionTime":"2025-12-03T22:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.103822 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.103890 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.103908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.103932 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.103950 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:52Z","lastTransitionTime":"2025-12-03T22:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.206882 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.206970 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.206989 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.207016 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.207033 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:52Z","lastTransitionTime":"2025-12-03T22:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.310165 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.310262 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.310279 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.310306 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.310329 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:52Z","lastTransitionTime":"2025-12-03T22:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.336715 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.336766 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.336838 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:52 crc kubenswrapper[4830]: E1203 22:06:52.336887 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:52 crc kubenswrapper[4830]: E1203 22:06:52.337007 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:52 crc kubenswrapper[4830]: E1203 22:06:52.337191 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.413757 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.413829 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.413849 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.413874 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.413892 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:52Z","lastTransitionTime":"2025-12-03T22:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.517251 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.517305 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.517320 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.517339 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.517358 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:52Z","lastTransitionTime":"2025-12-03T22:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.625063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.625153 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.625184 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.625225 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.625251 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:52Z","lastTransitionTime":"2025-12-03T22:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.730015 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.730069 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.730086 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.730110 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.730131 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:52Z","lastTransitionTime":"2025-12-03T22:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.833672 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.833754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.833777 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.833814 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.833835 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:52Z","lastTransitionTime":"2025-12-03T22:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.937110 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.937186 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.937203 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.937230 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:52 crc kubenswrapper[4830]: I1203 22:06:52.937248 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:52Z","lastTransitionTime":"2025-12-03T22:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.040266 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.040333 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.040356 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.040387 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.040410 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:53Z","lastTransitionTime":"2025-12-03T22:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.143660 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.143723 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.143741 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.143765 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.143784 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:53Z","lastTransitionTime":"2025-12-03T22:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.247407 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.247485 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.247542 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.247574 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.247599 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:53Z","lastTransitionTime":"2025-12-03T22:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.336541 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:53 crc kubenswrapper[4830]: E1203 22:06:53.337008 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.351207 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.351271 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.351296 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.351325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.351349 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:53Z","lastTransitionTime":"2025-12-03T22:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.454003 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.454064 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.454084 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.454107 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.454124 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:53Z","lastTransitionTime":"2025-12-03T22:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.556673 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.556741 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.556763 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.556793 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.556815 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:53Z","lastTransitionTime":"2025-12-03T22:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.659903 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.659963 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.659978 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.659998 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.660012 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:53Z","lastTransitionTime":"2025-12-03T22:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.763386 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.763445 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.763458 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.763479 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.763493 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:53Z","lastTransitionTime":"2025-12-03T22:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.866887 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.866953 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.866970 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.866994 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.867012 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:53Z","lastTransitionTime":"2025-12-03T22:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.970032 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.970071 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.970083 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.970099 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:53 crc kubenswrapper[4830]: I1203 22:06:53.970110 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:53Z","lastTransitionTime":"2025-12-03T22:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.073093 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.073156 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.073174 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.073197 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.073215 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:54Z","lastTransitionTime":"2025-12-03T22:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.175978 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.176052 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.176069 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.176094 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.176111 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:54Z","lastTransitionTime":"2025-12-03T22:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.278569 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.278633 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.278651 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.278677 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.278694 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:54Z","lastTransitionTime":"2025-12-03T22:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.336388 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.336446 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.336458 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:54 crc kubenswrapper[4830]: E1203 22:06:54.336639 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:54 crc kubenswrapper[4830]: E1203 22:06:54.336972 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:54 crc kubenswrapper[4830]: E1203 22:06:54.337213 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.382429 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.382497 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.382557 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.382590 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.382615 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:54Z","lastTransitionTime":"2025-12-03T22:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.484749 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.484816 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.484840 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.484985 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.485014 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:54Z","lastTransitionTime":"2025-12-03T22:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.588252 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.588311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.588328 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.588350 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.588368 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:54Z","lastTransitionTime":"2025-12-03T22:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.691473 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.691591 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.691614 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.691642 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.691665 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:54Z","lastTransitionTime":"2025-12-03T22:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.794587 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.794653 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.794670 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.794696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.794713 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:54Z","lastTransitionTime":"2025-12-03T22:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.897984 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.898060 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.898078 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.898106 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:54 crc kubenswrapper[4830]: I1203 22:06:54.898125 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:54Z","lastTransitionTime":"2025-12-03T22:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.000581 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.000666 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.000686 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.000717 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.000743 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:55Z","lastTransitionTime":"2025-12-03T22:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.104147 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.104226 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.104249 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.104282 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.104303 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:55Z","lastTransitionTime":"2025-12-03T22:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.207809 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.207894 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.207913 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.207943 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.207973 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:55Z","lastTransitionTime":"2025-12-03T22:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.310826 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.310913 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.310939 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.310975 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.310998 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:55Z","lastTransitionTime":"2025-12-03T22:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.337170 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:55 crc kubenswrapper[4830]: E1203 22:06:55.337695 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.418373 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.418541 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.418563 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.418602 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.418665 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:55Z","lastTransitionTime":"2025-12-03T22:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.524574 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.524618 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.524634 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.524684 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.524701 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:55Z","lastTransitionTime":"2025-12-03T22:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.627283 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.627329 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.627348 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.627370 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.627386 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:55Z","lastTransitionTime":"2025-12-03T22:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.730460 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.730530 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.730541 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.730562 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.730575 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:55Z","lastTransitionTime":"2025-12-03T22:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.833809 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.833868 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.833888 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.833915 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.833932 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:55Z","lastTransitionTime":"2025-12-03T22:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.937243 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.937392 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.937408 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.937434 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:55 crc kubenswrapper[4830]: I1203 22:06:55.937449 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:55Z","lastTransitionTime":"2025-12-03T22:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.040254 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.040326 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.040351 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.040381 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.040403 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:56Z","lastTransitionTime":"2025-12-03T22:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.143302 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.143372 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.143388 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.143411 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.143430 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:56Z","lastTransitionTime":"2025-12-03T22:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.246114 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.246187 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.246205 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.246229 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.246248 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:56Z","lastTransitionTime":"2025-12-03T22:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.336355 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.336355 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.336483 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:56 crc kubenswrapper[4830]: E1203 22:06:56.336729 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:56 crc kubenswrapper[4830]: E1203 22:06:56.336892 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:56 crc kubenswrapper[4830]: E1203 22:06:56.337114 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.338270 4830 scope.go:117] "RemoveContainer" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" Dec 03 22:06:56 crc kubenswrapper[4830]: E1203 22:06:56.338569 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.349558 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.349623 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.349642 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.349666 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.349685 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:56Z","lastTransitionTime":"2025-12-03T22:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.452802 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.452865 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.452882 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.452908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.452926 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:56Z","lastTransitionTime":"2025-12-03T22:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.556812 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.556915 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.556949 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.556988 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.557023 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:56Z","lastTransitionTime":"2025-12-03T22:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.659690 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.659747 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.659765 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.659789 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.659806 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:56Z","lastTransitionTime":"2025-12-03T22:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.762936 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.763072 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.763098 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.763128 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.763152 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:56Z","lastTransitionTime":"2025-12-03T22:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.866313 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.866377 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.866396 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.866423 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.866442 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:56Z","lastTransitionTime":"2025-12-03T22:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.968945 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.969007 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.969021 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.969045 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:56 crc kubenswrapper[4830]: I1203 22:06:56.969064 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:56Z","lastTransitionTime":"2025-12-03T22:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.072407 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.072473 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.072491 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.072553 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.072573 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:57Z","lastTransitionTime":"2025-12-03T22:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.176249 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.176318 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.176343 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.176428 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.176446 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:57Z","lastTransitionTime":"2025-12-03T22:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.280097 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.280173 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.280197 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.280226 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.280249 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:57Z","lastTransitionTime":"2025-12-03T22:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.336440 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:57 crc kubenswrapper[4830]: E1203 22:06:57.336966 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.383403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.383458 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.383473 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.383492 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.383539 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:57Z","lastTransitionTime":"2025-12-03T22:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.486302 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.486370 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.486387 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.486412 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.486431 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:57Z","lastTransitionTime":"2025-12-03T22:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.589421 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.589563 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.589588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.589611 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.589632 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:57Z","lastTransitionTime":"2025-12-03T22:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.692470 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.692551 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.692571 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.692606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.692705 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:57Z","lastTransitionTime":"2025-12-03T22:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.795968 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.796027 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.796048 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.796074 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.796096 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:57Z","lastTransitionTime":"2025-12-03T22:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.898634 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.899195 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.899696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.899784 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:57 crc kubenswrapper[4830]: I1203 22:06:57.899810 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:57Z","lastTransitionTime":"2025-12-03T22:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.003480 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.003572 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.003595 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.003624 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.003646 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:58Z","lastTransitionTime":"2025-12-03T22:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.106741 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.106781 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.106807 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.106825 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.106834 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:58Z","lastTransitionTime":"2025-12-03T22:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.209419 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.209490 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.209545 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.209568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.209581 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:58Z","lastTransitionTime":"2025-12-03T22:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.312909 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.312982 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.313006 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.313038 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.313059 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:58Z","lastTransitionTime":"2025-12-03T22:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.336138 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.336271 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.336279 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:06:58 crc kubenswrapper[4830]: E1203 22:06:58.336450 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:06:58 crc kubenswrapper[4830]: E1203 22:06:58.336599 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:06:58 crc kubenswrapper[4830]: E1203 22:06:58.336807 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.415971 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.416030 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.416049 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.416073 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.416091 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:58Z","lastTransitionTime":"2025-12-03T22:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.519074 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.519212 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.519231 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.519254 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.519270 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:58Z","lastTransitionTime":"2025-12-03T22:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.623043 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.623119 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.623140 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.623170 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.623194 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:58Z","lastTransitionTime":"2025-12-03T22:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.725965 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.726021 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.726039 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.726062 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.726082 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:58Z","lastTransitionTime":"2025-12-03T22:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.828911 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.828971 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.828987 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.829012 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.829029 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:58Z","lastTransitionTime":"2025-12-03T22:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.932466 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.932577 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.932601 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.932633 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.932657 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:58Z","lastTransitionTime":"2025-12-03T22:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.967087 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.967140 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.967154 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.967172 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 22:06:58 crc kubenswrapper[4830]: I1203 22:06:58.967184 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T22:06:58Z","lastTransitionTime":"2025-12-03T22:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.035786 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75"] Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.036387 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.038557 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.039210 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.039449 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.039668 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.061378 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=85.061357288 podStartE2EDuration="1m25.061357288s" podCreationTimestamp="2025-12-03 22:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:59.060480994 +0000 UTC m=+108.056942383" watchObservedRunningTime="2025-12-03 22:06:59.061357288 +0000 UTC m=+108.057818677" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.064122 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0609ee9c-5b69-4339-8999-515d13005b71-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.064183 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0609ee9c-5b69-4339-8999-515d13005b71-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.064252 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0609ee9c-5b69-4339-8999-515d13005b71-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.064334 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0609ee9c-5b69-4339-8999-515d13005b71-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.064377 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0609ee9c-5b69-4339-8999-515d13005b71-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.081123 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=60.081101103 podStartE2EDuration="1m0.081101103s" 
podCreationTimestamp="2025-12-03 22:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:59.0810063 +0000 UTC m=+108.077467729" watchObservedRunningTime="2025-12-03 22:06:59.081101103 +0000 UTC m=+108.077562462" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.113744 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pr4cr" podStartSLOduration=89.113700233 podStartE2EDuration="1m29.113700233s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:59.113535818 +0000 UTC m=+108.109997167" watchObservedRunningTime="2025-12-03 22:06:59.113700233 +0000 UTC m=+108.110161612" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.138111 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sh485" podStartSLOduration=89.138090947 podStartE2EDuration="1m29.138090947s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:59.126648031 +0000 UTC m=+108.123109390" watchObservedRunningTime="2025-12-03 22:06:59.138090947 +0000 UTC m=+108.134552306" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.165604 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0609ee9c-5b69-4339-8999-515d13005b71-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.165691 4830 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0609ee9c-5b69-4339-8999-515d13005b71-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.165777 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0609ee9c-5b69-4339-8999-515d13005b71-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.165884 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0609ee9c-5b69-4339-8999-515d13005b71-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.165927 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0609ee9c-5b69-4339-8999-515d13005b71-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.166026 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0609ee9c-5b69-4339-8999-515d13005b71-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.166111 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0609ee9c-5b69-4339-8999-515d13005b71-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.167863 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0609ee9c-5b69-4339-8999-515d13005b71-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.173212 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0609ee9c-5b69-4339-8999-515d13005b71-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.178580 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=85.178551473 podStartE2EDuration="1m25.178551473s" podCreationTimestamp="2025-12-03 22:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:59.177036352 +0000 UTC m=+108.173497701" watchObservedRunningTime="2025-12-03 22:06:59.178551473 +0000 UTC m=+108.175012862" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.193781 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0609ee9c-5b69-4339-8999-515d13005b71-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wwj75\" (UID: \"0609ee9c-5b69-4339-8999-515d13005b71\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.208031 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podStartSLOduration=89.208015197 podStartE2EDuration="1m29.208015197s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:59.206947087 +0000 UTC m=+108.203408456" watchObservedRunningTime="2025-12-03 22:06:59.208015197 +0000 UTC m=+108.204476556" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.236023 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wdcn6" podStartSLOduration=89.23600305 podStartE2EDuration="1m29.23600305s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:59.235707772 +0000 UTC m=+108.232169151" watchObservedRunningTime="2025-12-03 22:06:59.23600305 +0000 UTC m=+108.232464429" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.269684 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wzd28" podStartSLOduration=89.269665759 podStartE2EDuration="1m29.269665759s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:59.256236418 +0000 UTC m=+108.252697797" 
watchObservedRunningTime="2025-12-03 22:06:59.269665759 +0000 UTC m=+108.266127138" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.291091 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=35.291068749 podStartE2EDuration="35.291068749s" podCreationTimestamp="2025-12-03 22:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:59.2707863 +0000 UTC m=+108.267247679" watchObservedRunningTime="2025-12-03 22:06:59.291068749 +0000 UTC m=+108.287530138" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.312168 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.312148732 podStartE2EDuration="1m28.312148732s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:59.291639756 +0000 UTC m=+108.288101135" watchObservedRunningTime="2025-12-03 22:06:59.312148732 +0000 UTC m=+108.308610121" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.340268 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:06:59 crc kubenswrapper[4830]: E1203 22:06:59.340424 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.366889 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.424445 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9tz2v" podStartSLOduration=88.424421991 podStartE2EDuration="1m28.424421991s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:59.409607312 +0000 UTC m=+108.406068701" watchObservedRunningTime="2025-12-03 22:06:59.424421991 +0000 UTC m=+108.420883370" Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.995414 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" event={"ID":"0609ee9c-5b69-4339-8999-515d13005b71","Type":"ContainerStarted","Data":"10dddfc787f9d6760dfd7a3ef08c97ce2a8e311ee63b50d63f0fe17c8064943f"} Dec 03 22:06:59 crc kubenswrapper[4830]: I1203 22:06:59.995499 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" event={"ID":"0609ee9c-5b69-4339-8999-515d13005b71","Type":"ContainerStarted","Data":"d8bff93465877d855c2c79c9902f38a108efa4624c526aac9e3f2de6e118cf5b"} Dec 03 22:07:00 crc kubenswrapper[4830]: I1203 22:07:00.336358 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:00 crc kubenswrapper[4830]: I1203 22:07:00.336451 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:00 crc kubenswrapper[4830]: I1203 22:07:00.336370 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:00 crc kubenswrapper[4830]: E1203 22:07:00.336625 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:00 crc kubenswrapper[4830]: E1203 22:07:00.336765 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:00 crc kubenswrapper[4830]: E1203 22:07:00.336898 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:01 crc kubenswrapper[4830]: I1203 22:07:01.336492 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:01 crc kubenswrapper[4830]: E1203 22:07:01.339261 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:02 crc kubenswrapper[4830]: I1203 22:07:02.336610 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:02 crc kubenswrapper[4830]: E1203 22:07:02.336832 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:02 crc kubenswrapper[4830]: I1203 22:07:02.337241 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:02 crc kubenswrapper[4830]: E1203 22:07:02.337391 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:02 crc kubenswrapper[4830]: I1203 22:07:02.337726 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:02 crc kubenswrapper[4830]: E1203 22:07:02.337874 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:03 crc kubenswrapper[4830]: I1203 22:07:03.335992 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:03 crc kubenswrapper[4830]: E1203 22:07:03.336181 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:04 crc kubenswrapper[4830]: I1203 22:07:04.011473 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sh485_bdccedf8-f580-49f0-848e-108c748d8a21/kube-multus/1.log" Dec 03 22:07:04 crc kubenswrapper[4830]: I1203 22:07:04.012024 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sh485_bdccedf8-f580-49f0-848e-108c748d8a21/kube-multus/0.log" Dec 03 22:07:04 crc kubenswrapper[4830]: I1203 22:07:04.012082 4830 generic.go:334] "Generic (PLEG): container finished" podID="bdccedf8-f580-49f0-848e-108c748d8a21" containerID="d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7" exitCode=1 Dec 03 22:07:04 crc kubenswrapper[4830]: I1203 22:07:04.012115 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sh485" event={"ID":"bdccedf8-f580-49f0-848e-108c748d8a21","Type":"ContainerDied","Data":"d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7"} Dec 03 22:07:04 crc kubenswrapper[4830]: I1203 22:07:04.012153 4830 scope.go:117] "RemoveContainer" containerID="7e2da22cc8f8ae374580d170feeeb9b09aa0831157e157a6950539c27aad3a98" Dec 03 22:07:04 crc kubenswrapper[4830]: I1203 22:07:04.012592 4830 scope.go:117] "RemoveContainer" containerID="d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7" Dec 03 22:07:04 crc kubenswrapper[4830]: E1203 22:07:04.012748 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-sh485_openshift-multus(bdccedf8-f580-49f0-848e-108c748d8a21)\"" pod="openshift-multus/multus-sh485" podUID="bdccedf8-f580-49f0-848e-108c748d8a21" Dec 03 22:07:04 crc kubenswrapper[4830]: I1203 22:07:04.051731 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wwj75" podStartSLOduration=94.051711918 podStartE2EDuration="1m34.051711918s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:00.012886627 +0000 UTC m=+109.009348056" watchObservedRunningTime="2025-12-03 22:07:04.051711918 +0000 UTC m=+113.048173287" Dec 03 22:07:04 crc kubenswrapper[4830]: I1203 22:07:04.336472 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:04 crc kubenswrapper[4830]: I1203 22:07:04.336554 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:04 crc kubenswrapper[4830]: I1203 22:07:04.336620 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:04 crc kubenswrapper[4830]: E1203 22:07:04.336691 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:04 crc kubenswrapper[4830]: E1203 22:07:04.336812 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:04 crc kubenswrapper[4830]: E1203 22:07:04.336915 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:05 crc kubenswrapper[4830]: I1203 22:07:05.017943 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sh485_bdccedf8-f580-49f0-848e-108c748d8a21/kube-multus/1.log" Dec 03 22:07:05 crc kubenswrapper[4830]: I1203 22:07:05.336954 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:05 crc kubenswrapper[4830]: E1203 22:07:05.337166 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:06 crc kubenswrapper[4830]: I1203 22:07:06.336907 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:06 crc kubenswrapper[4830]: I1203 22:07:06.336960 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:06 crc kubenswrapper[4830]: I1203 22:07:06.337004 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:06 crc kubenswrapper[4830]: E1203 22:07:06.337089 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:06 crc kubenswrapper[4830]: E1203 22:07:06.337229 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:06 crc kubenswrapper[4830]: E1203 22:07:06.337367 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:07 crc kubenswrapper[4830]: I1203 22:07:07.336776 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:07 crc kubenswrapper[4830]: E1203 22:07:07.337134 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:08 crc kubenswrapper[4830]: I1203 22:07:08.336344 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:08 crc kubenswrapper[4830]: I1203 22:07:08.336402 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:08 crc kubenswrapper[4830]: I1203 22:07:08.336432 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:08 crc kubenswrapper[4830]: E1203 22:07:08.337268 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:08 crc kubenswrapper[4830]: E1203 22:07:08.337608 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:08 crc kubenswrapper[4830]: E1203 22:07:08.337652 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:08 crc kubenswrapper[4830]: I1203 22:07:08.338292 4830 scope.go:117] "RemoveContainer" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" Dec 03 22:07:08 crc kubenswrapper[4830]: E1203 22:07:08.338726 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5vgkl_openshift-ovn-kubernetes(44a18320-6162-4fc5-a89c-363c4c6cd030)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" Dec 03 22:07:09 crc kubenswrapper[4830]: I1203 22:07:09.336589 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:09 crc kubenswrapper[4830]: E1203 22:07:09.336771 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:10 crc kubenswrapper[4830]: I1203 22:07:10.336913 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:10 crc kubenswrapper[4830]: I1203 22:07:10.336966 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:10 crc kubenswrapper[4830]: I1203 22:07:10.337011 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:10 crc kubenswrapper[4830]: E1203 22:07:10.337150 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:10 crc kubenswrapper[4830]: E1203 22:07:10.337286 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:10 crc kubenswrapper[4830]: E1203 22:07:10.337493 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:11 crc kubenswrapper[4830]: E1203 22:07:11.282375 4830 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 22:07:11 crc kubenswrapper[4830]: I1203 22:07:11.336954 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:11 crc kubenswrapper[4830]: E1203 22:07:11.338123 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:11 crc kubenswrapper[4830]: E1203 22:07:11.444499 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:07:12 crc kubenswrapper[4830]: I1203 22:07:12.336635 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:12 crc kubenswrapper[4830]: E1203 22:07:12.336773 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:12 crc kubenswrapper[4830]: I1203 22:07:12.336917 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:12 crc kubenswrapper[4830]: I1203 22:07:12.336987 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:12 crc kubenswrapper[4830]: E1203 22:07:12.337061 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:12 crc kubenswrapper[4830]: E1203 22:07:12.337124 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:13 crc kubenswrapper[4830]: I1203 22:07:13.336429 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:13 crc kubenswrapper[4830]: E1203 22:07:13.336663 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:14 crc kubenswrapper[4830]: I1203 22:07:14.336829 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:14 crc kubenswrapper[4830]: I1203 22:07:14.336950 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:14 crc kubenswrapper[4830]: I1203 22:07:14.336833 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:14 crc kubenswrapper[4830]: E1203 22:07:14.337000 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:14 crc kubenswrapper[4830]: E1203 22:07:14.337127 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:14 crc kubenswrapper[4830]: E1203 22:07:14.337243 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:15 crc kubenswrapper[4830]: I1203 22:07:15.336989 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:15 crc kubenswrapper[4830]: E1203 22:07:15.337185 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:16 crc kubenswrapper[4830]: I1203 22:07:16.336985 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:16 crc kubenswrapper[4830]: I1203 22:07:16.337077 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:16 crc kubenswrapper[4830]: I1203 22:07:16.337092 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:16 crc kubenswrapper[4830]: E1203 22:07:16.337254 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:16 crc kubenswrapper[4830]: E1203 22:07:16.337416 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:16 crc kubenswrapper[4830]: E1203 22:07:16.337565 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:16 crc kubenswrapper[4830]: E1203 22:07:16.445885 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:07:17 crc kubenswrapper[4830]: I1203 22:07:17.336800 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:17 crc kubenswrapper[4830]: E1203 22:07:17.337003 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:17 crc kubenswrapper[4830]: I1203 22:07:17.337767 4830 scope.go:117] "RemoveContainer" containerID="d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7" Dec 03 22:07:18 crc kubenswrapper[4830]: I1203 22:07:18.070804 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sh485_bdccedf8-f580-49f0-848e-108c748d8a21/kube-multus/1.log" Dec 03 22:07:18 crc kubenswrapper[4830]: I1203 22:07:18.071258 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sh485" event={"ID":"bdccedf8-f580-49f0-848e-108c748d8a21","Type":"ContainerStarted","Data":"f31a89c57a557903807518ac7ebc8edd96c78ae625733ebde7f23a71a9214f2e"} Dec 03 22:07:18 crc kubenswrapper[4830]: I1203 22:07:18.336440 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:18 crc kubenswrapper[4830]: I1203 22:07:18.336534 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:18 crc kubenswrapper[4830]: I1203 22:07:18.336559 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:18 crc kubenswrapper[4830]: E1203 22:07:18.336674 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:18 crc kubenswrapper[4830]: E1203 22:07:18.336840 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:18 crc kubenswrapper[4830]: E1203 22:07:18.337043 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:19 crc kubenswrapper[4830]: I1203 22:07:19.336404 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:19 crc kubenswrapper[4830]: E1203 22:07:19.337147 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:19 crc kubenswrapper[4830]: I1203 22:07:19.337873 4830 scope.go:117] "RemoveContainer" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" Dec 03 22:07:20 crc kubenswrapper[4830]: I1203 22:07:20.081170 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/3.log" Dec 03 22:07:20 crc kubenswrapper[4830]: I1203 22:07:20.084650 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerStarted","Data":"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f"} Dec 03 22:07:20 crc kubenswrapper[4830]: I1203 22:07:20.085285 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:07:20 crc kubenswrapper[4830]: I1203 22:07:20.132062 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podStartSLOduration=110.13203739 podStartE2EDuration="1m50.13203739s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:20.12986142 +0000 UTC m=+129.126322869" watchObservedRunningTime="2025-12-03 22:07:20.13203739 +0000 UTC m=+129.128498779" Dec 03 22:07:20 crc kubenswrapper[4830]: I1203 22:07:20.269670 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zlcmr"] Dec 03 22:07:20 crc kubenswrapper[4830]: I1203 22:07:20.269921 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:20 crc kubenswrapper[4830]: E1203 22:07:20.270132 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:20 crc kubenswrapper[4830]: I1203 22:07:20.336061 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:20 crc kubenswrapper[4830]: I1203 22:07:20.336126 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:20 crc kubenswrapper[4830]: I1203 22:07:20.336192 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:20 crc kubenswrapper[4830]: E1203 22:07:20.336279 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:20 crc kubenswrapper[4830]: E1203 22:07:20.336396 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:20 crc kubenswrapper[4830]: E1203 22:07:20.336545 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:21 crc kubenswrapper[4830]: E1203 22:07:21.446601 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:07:22 crc kubenswrapper[4830]: I1203 22:07:22.336844 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:22 crc kubenswrapper[4830]: I1203 22:07:22.336916 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:22 crc kubenswrapper[4830]: I1203 22:07:22.336919 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:22 crc kubenswrapper[4830]: E1203 22:07:22.337040 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:22 crc kubenswrapper[4830]: I1203 22:07:22.337075 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:22 crc kubenswrapper[4830]: E1203 22:07:22.337225 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:22 crc kubenswrapper[4830]: E1203 22:07:22.337373 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:22 crc kubenswrapper[4830]: E1203 22:07:22.337451 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:24 crc kubenswrapper[4830]: I1203 22:07:24.336540 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:24 crc kubenswrapper[4830]: I1203 22:07:24.336580 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:24 crc kubenswrapper[4830]: I1203 22:07:24.336621 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:24 crc kubenswrapper[4830]: E1203 22:07:24.336720 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:24 crc kubenswrapper[4830]: I1203 22:07:24.336765 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:24 crc kubenswrapper[4830]: E1203 22:07:24.336924 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:24 crc kubenswrapper[4830]: E1203 22:07:24.337091 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:24 crc kubenswrapper[4830]: E1203 22:07:24.337210 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:26 crc kubenswrapper[4830]: I1203 22:07:26.336822 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:26 crc kubenswrapper[4830]: I1203 22:07:26.336858 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:26 crc kubenswrapper[4830]: I1203 22:07:26.336958 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:26 crc kubenswrapper[4830]: I1203 22:07:26.336784 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:26 crc kubenswrapper[4830]: E1203 22:07:26.337026 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 22:07:26 crc kubenswrapper[4830]: E1203 22:07:26.337150 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 22:07:26 crc kubenswrapper[4830]: E1203 22:07:26.337220 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 22:07:26 crc kubenswrapper[4830]: E1203 22:07:26.337314 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zlcmr" podUID="211b6f37-bd3f-475e-b4d9-e3d94ae07c52" Dec 03 22:07:28 crc kubenswrapper[4830]: I1203 22:07:28.336371 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:28 crc kubenswrapper[4830]: I1203 22:07:28.336433 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:28 crc kubenswrapper[4830]: I1203 22:07:28.336446 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:28 crc kubenswrapper[4830]: I1203 22:07:28.336557 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:28 crc kubenswrapper[4830]: I1203 22:07:28.339689 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 22:07:28 crc kubenswrapper[4830]: I1203 22:07:28.341276 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 22:07:28 crc kubenswrapper[4830]: I1203 22:07:28.341630 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 22:07:28 crc kubenswrapper[4830]: I1203 22:07:28.343153 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 22:07:28 crc kubenswrapper[4830]: I1203 22:07:28.343314 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 22:07:28 crc kubenswrapper[4830]: I1203 22:07:28.345138 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.577829 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.637340 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv6mk"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.638002 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.639096 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wt5n6"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.639953 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.642082 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.643334 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.644164 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.644443 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.644766 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.645447 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.653082 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.653378 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.653728 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.654023 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.654267 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.654317 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.654634 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.654726 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.655099 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.655153 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.655352 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.655399 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.655968 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.656192 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lz46c"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.656906 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.667184 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.667648 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.668858 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-c8f8r"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.669836 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vtqqc"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.670636 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.671094 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.671394 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.697203 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.697921 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lq5x8"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.698543 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.698648 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.699157 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.699872 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.700078 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.701105 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.701171 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.701237 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.703252 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.704029 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.707817 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-drfg4"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 
22:07:29.708232 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.708777 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.709136 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.709374 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.713201 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.714050 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.714500 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.714919 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.724274 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.725066 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qf7rn"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.725644 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.725908 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.726052 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.726172 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.726232 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.726325 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.726395 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.728857 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.729040 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.729195 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.729318 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.729415 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.729500 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.729632 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.729730 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.729969 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.730352 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.730451 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.730602 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.730655 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.730707 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.731007 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.731155 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.731254 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.731327 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.731470 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.731609 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.731766 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.731896 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.732016 4830 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.732115 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.732236 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.732355 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.732474 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.732790 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.732907 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.733017 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.733140 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.733242 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.733343 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 22:07:29 crc 
kubenswrapper[4830]: I1203 22:07:29.733433 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.733547 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.733644 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.733747 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.731263 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.738378 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.738487 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.738927 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.739990 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.740011 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.729527 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.740239 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.743116 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751094 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751248 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-encryption-config\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751275 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5604c165-492d-4736-848c-254474834852-audit-dir\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751297 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9e609cc9-2f55-41ad-8234-f57ef9928b69-trusted-ca\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751319 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be04ea3-029e-4aab-b86c-211ef277f024-serving-cert\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751342 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-audit\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751360 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5604c165-492d-4736-848c-254474834852-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751376 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 
22:07:29.751399 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeb240dc-cbe3-4b23-b806-4296015a46f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751415 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5604c165-492d-4736-848c-254474834852-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751433 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvkbc\" (UniqueName: \"kubernetes.io/projected/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-kube-api-access-wvkbc\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751449 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnpsv\" (UniqueName: \"kubernetes.io/projected/9e609cc9-2f55-41ad-8234-f57ef9928b69-kube-api-access-gnpsv\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751467 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f19636bc-ac54-4bfb-a75f-63049dd5c460-config\") pod 
\"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751486 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-etcd-serving-ca\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751522 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-dir\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751553 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-client-ca\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751572 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751597 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-etcd-client\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751653 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5604c165-492d-4736-848c-254474834852-encryption-config\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751673 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e63cd790-9ef7-4a09-b132-e2f85e4310ce-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751688 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf8a7d50-89ba-4804-8ad8-ae427909d60e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ngg6q\" (UID: \"bf8a7d50-89ba-4804-8ad8-ae427909d60e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751705 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-node-pullsecrets\") pod 
\"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751724 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f19636bc-ac54-4bfb-a75f-63049dd5c460-auth-proxy-config\") pod \"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751745 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4zhx\" (UniqueName: \"kubernetes.io/projected/bf8a7d50-89ba-4804-8ad8-ae427909d60e-kube-api-access-t4zhx\") pod \"cluster-samples-operator-665b6dd947-ngg6q\" (UID: \"bf8a7d50-89ba-4804-8ad8-ae427909d60e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751765 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-serving-cert\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.751879 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 
22:07:29.751914 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-audit-dir\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752006 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752043 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5604c165-492d-4736-848c-254474834852-etcd-client\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752082 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-policies\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752117 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5604c165-492d-4736-848c-254474834852-audit-policies\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: 
\"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752138 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752157 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752192 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f19636bc-ac54-4bfb-a75f-63049dd5c460-machine-approver-tls\") pod \"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752237 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752260 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b9ff0d92-ab2f-4815-9659-7b4507d64344-images\") pod \"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752298 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752319 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5604c165-492d-4736-848c-254474834852-serving-cert\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752361 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752382 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: 
\"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752401 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752426 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752464 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrzs4\" (UniqueName: \"kubernetes.io/projected/6be04ea3-029e-4aab-b86c-211ef277f024-kube-api-access-xrzs4\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752488 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e63cd790-9ef7-4a09-b132-e2f85e4310ce-service-ca-bundle\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: 
I1203 22:07:29.752521 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e609cc9-2f55-41ad-8234-f57ef9928b69-serving-cert\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752566 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pzhf\" (UniqueName: \"kubernetes.io/projected/5604c165-492d-4736-848c-254474834852-kube-api-access-9pzhf\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752590 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zgps\" (UniqueName: \"kubernetes.io/projected/aeb240dc-cbe3-4b23-b806-4296015a46f0-kube-api-access-5zgps\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752653 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e63cd790-9ef7-4a09-b132-e2f85e4310ce-serving-cert\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752674 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn4pg\" (UniqueName: \"kubernetes.io/projected/e63cd790-9ef7-4a09-b132-e2f85e4310ce-kube-api-access-mn4pg\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752746 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkxkv\" (UniqueName: \"kubernetes.io/projected/f19636bc-ac54-4bfb-a75f-63049dd5c460-kube-api-access-tkxkv\") pod \"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752769 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-config\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752813 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-config\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " 
pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752836 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-image-import-ca\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752860 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff0d92-ab2f-4815-9659-7b4507d64344-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752878 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-config\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752894 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqvf5\" (UniqueName: \"kubernetes.io/projected/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-kube-api-access-jqvf5\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752930 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-client-ca\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752949 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63cd790-9ef7-4a09-b132-e2f85e4310ce-config\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752974 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ff0d92-ab2f-4815-9659-7b4507d64344-config\") pod \"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.752990 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q44nt\" (UniqueName: \"kubernetes.io/projected/b9ff0d92-ab2f-4815-9659-7b4507d64344-kube-api-access-q44nt\") pod \"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.753008 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e609cc9-2f55-41ad-8234-f57ef9928b69-config\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc 
kubenswrapper[4830]: I1203 22:07:29.755165 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.756487 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.756926 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.757321 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.758708 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.758743 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.758831 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.758872 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.758916 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.759047 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.759134 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.759231 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.759271 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.759234 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.759357 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.759461 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.767464 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 22:07:29 crc 
kubenswrapper[4830]: I1203 22:07:29.767664 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.767750 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.767823 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.767896 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.767980 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.768062 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.768189 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.768249 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.768291 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.768370 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 22:07:29 crc 
kubenswrapper[4830]: I1203 22:07:29.768445 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.770796 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.771617 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.772205 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.772459 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.772571 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.774107 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-76msc"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.775339 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.776224 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.776312 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.777101 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wt5n6"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.777785 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.777903 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.778395 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.779010 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-kqxdj"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.779355 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-kqxdj" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.781864 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.782198 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.782646 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.783264 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.785430 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-st7xh"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.786042 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gqw4d"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.786142 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.787113 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.787422 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.787734 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.787879 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.794184 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.795000 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.797322 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.797885 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.798314 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.798685 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.799601 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.800155 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.801035 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.801054 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.801766 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.804967 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jbhf7"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.805421 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vhtwk"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.806474 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.807026 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.807904 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rm2x4"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.808405 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.808915 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.809564 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.811648 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv6mk"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.813220 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vtqqc"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.820677 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.827320 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.827343 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-c8f8r"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.829101 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.829749 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.830921 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.833142 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lq5x8"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.834103 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.836151 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hds2c"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.837841 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-76msc"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.837945 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.838800 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lz46c"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.840602 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.841385 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.842982 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.845463 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-drfg4"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.846627 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-st7xh"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.847872 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.849365 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qf7rn"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.850363 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.852449 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66"] 
Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.854646 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.854704 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855109 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-config\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855142 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-webhook-cert\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855162 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-client-ca\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855179 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63cd790-9ef7-4a09-b132-e2f85e4310ce-config\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855195 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-config\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855210 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqvf5\" (UniqueName: \"kubernetes.io/projected/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-kube-api-access-jqvf5\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855228 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e609cc9-2f55-41ad-8234-f57ef9928b69-config\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855243 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ff0d92-ab2f-4815-9659-7b4507d64344-config\") pod \"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855261 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q44nt\" (UniqueName: \"kubernetes.io/projected/b9ff0d92-ab2f-4815-9659-7b4507d64344-kube-api-access-q44nt\") pod 
\"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855279 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-encryption-config\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855301 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5604c165-492d-4736-848c-254474834852-audit-dir\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855318 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9mq6\" (UniqueName: \"kubernetes.io/projected/544dd01f-d91c-4195-b649-8d2aa5a54c49-kube-api-access-s9mq6\") pod \"migrator-59844c95c7-tz5gv\" (UID: \"544dd01f-d91c-4195-b649-8d2aa5a54c49\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855336 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e609cc9-2f55-41ad-8234-f57ef9928b69-trusted-ca\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855352 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/5604c165-492d-4736-848c-254474834852-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855370 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be04ea3-029e-4aab-b86c-211ef277f024-serving-cert\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855388 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfe812a6-0f2d-49a3-ba8b-9af722589906-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gtl86\" (UID: \"cfe812a6-0f2d-49a3-ba8b-9af722589906\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855404 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeb240dc-cbe3-4b23-b806-4296015a46f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855429 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/20fcc857-dc4e-43a1-83c7-5191075fe805-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-76msc\" (UID: \"20fcc857-dc4e-43a1-83c7-5191075fe805\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" Dec 03 22:07:29 crc 
kubenswrapper[4830]: I1203 22:07:29.855449 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5604c165-492d-4736-848c-254474834852-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855467 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvkbc\" (UniqueName: \"kubernetes.io/projected/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-kube-api-access-wvkbc\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855486 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-client-ca\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855516 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855559 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a079205-7ea0-45ab-aecf-7944fd65888c-proxy-tls\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: 
\"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855582 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-etcd-client\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855599 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-apiservice-cert\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855625 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e63cd790-9ef7-4a09-b132-e2f85e4310ce-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855643 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf8a7d50-89ba-4804-8ad8-ae427909d60e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ngg6q\" (UID: \"bf8a7d50-89ba-4804-8ad8-ae427909d60e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855662 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6cdbc50d-e4fc-4118-9226-657c5103f97d-srv-cert\") pod \"catalog-operator-68c6474976-d6h4q\" (UID: \"6cdbc50d-e4fc-4118-9226-657c5103f97d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855677 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfe812a6-0f2d-49a3-ba8b-9af722589906-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gtl86\" (UID: \"cfe812a6-0f2d-49a3-ba8b-9af722589906\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855699 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4zhx\" (UniqueName: \"kubernetes.io/projected/bf8a7d50-89ba-4804-8ad8-ae427909d60e-kube-api-access-t4zhx\") pod \"cluster-samples-operator-665b6dd947-ngg6q\" (UID: \"bf8a7d50-89ba-4804-8ad8-ae427909d60e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855716 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79145c71-ecbd-4434-ad66-bb1dc84facff-service-ca-bundle\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855734 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-audit-dir\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" 
Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855752 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8db4\" (UniqueName: \"kubernetes.io/projected/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-kube-api-access-v8db4\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855771 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkjxv\" (UniqueName: \"kubernetes.io/projected/ab47206e-15eb-4a18-8ca8-d8cfc7c510ff-kube-api-access-tkjxv\") pod \"openshift-controller-manager-operator-756b6f6bc6-ch42b\" (UID: \"ab47206e-15eb-4a18-8ca8-d8cfc7c510ff\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855791 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6bb7\" (UniqueName: \"kubernetes.io/projected/20fcc857-dc4e-43a1-83c7-5191075fe805-kube-api-access-l6bb7\") pod \"multus-admission-controller-857f4d67dd-76msc\" (UID: \"20fcc857-dc4e-43a1-83c7-5191075fe805\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855811 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea0a070a-b4cf-4bd0-abcc-28144b64aafb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vht66\" (UID: \"ea0a070a-b4cf-4bd0-abcc-28144b64aafb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855828 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea0a070a-b4cf-4bd0-abcc-28144b64aafb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vht66\" (UID: \"ea0a070a-b4cf-4bd0-abcc-28144b64aafb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855845 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37cd1cb-47b4-47da-9457-6ece58cecdb8-config\") pod \"service-ca-operator-777779d784-mjgm6\" (UID: \"a37cd1cb-47b4-47da-9457-6ece58cecdb8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855864 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5604c165-492d-4736-848c-254474834852-etcd-client\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855881 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-policies\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855900 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855916 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855934 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcqt8\" (UniqueName: \"kubernetes.io/projected/6cdbc50d-e4fc-4118-9226-657c5103f97d-kube-api-access-qcqt8\") pod \"catalog-operator-68c6474976-d6h4q\" (UID: \"6cdbc50d-e4fc-4118-9226-657c5103f97d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855950 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-client-ca\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.855953 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f197bf6d-1e49-4af0-9d6c-1df2e4fd0222-metrics-tls\") pod \"dns-operator-744455d44c-st7xh\" (UID: \"f197bf6d-1e49-4af0-9d6c-1df2e4fd0222\") " pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856007 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2f72\" (UniqueName: 
\"kubernetes.io/projected/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-kube-api-access-b2f72\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856029 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xrkb\" (UniqueName: \"kubernetes.io/projected/c8f89cbd-cef8-468a-973d-6e513dcb4e09-kube-api-access-4xrkb\") pod \"control-plane-machine-set-operator-78cbb6b69f-c2sm5\" (UID: \"c8f89cbd-cef8-468a-973d-6e513dcb4e09\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856051 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856068 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/79145c71-ecbd-4434-ad66-bb1dc84facff-stats-auth\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856091 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea0a070a-b4cf-4bd0-abcc-28144b64aafb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vht66\" (UID: \"ea0a070a-b4cf-4bd0-abcc-28144b64aafb\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856111 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrzs4\" (UniqueName: \"kubernetes.io/projected/6be04ea3-029e-4aab-b86c-211ef277f024-kube-api-access-xrzs4\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856128 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e63cd790-9ef7-4a09-b132-e2f85e4310ce-service-ca-bundle\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856144 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856161 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e609cc9-2f55-41ad-8234-f57ef9928b69-serving-cert\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856181 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pzhf\" 
(UniqueName: \"kubernetes.io/projected/5604c165-492d-4736-848c-254474834852-kube-api-access-9pzhf\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856197 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856213 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e7368d40-f4b4-49ea-9d46-fc1cff0c4438-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hkwdv\" (UID: \"e7368d40-f4b4-49ea-9d46-fc1cff0c4438\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856241 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e63cd790-9ef7-4a09-b132-e2f85e4310ce-serving-cert\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856265 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7368d40-f4b4-49ea-9d46-fc1cff0c4438-serving-cert\") pod \"openshift-config-operator-7777fb866f-hkwdv\" (UID: \"e7368d40-f4b4-49ea-9d46-fc1cff0c4438\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856280 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-config\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856300 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97d3467b-ec38-4f92-9bd1-17d3fbeac78a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lqr6w\" (UID: \"97d3467b-ec38-4f92-9bd1-17d3fbeac78a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856317 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fht5h\" (UniqueName: \"kubernetes.io/projected/b7334cd7-95ed-4eef-a166-ea13a8a59382-kube-api-access-fht5h\") pod \"kube-storage-version-migrator-operator-b67b599dd-bl8vb\" (UID: \"b7334cd7-95ed-4eef-a166-ea13a8a59382\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856331 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7a5f4368-709e-47c5-8a7c-669dc97e78c5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-549dj\" (UID: \"7a5f4368-709e-47c5-8a7c-669dc97e78c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856348 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hgk2\" (UniqueName: \"kubernetes.io/projected/0feb4ea5-f6ce-42ca-9119-5452ba323af2-kube-api-access-2hgk2\") pod \"downloads-7954f5f757-kqxdj\" (UID: \"0feb4ea5-f6ce-42ca-9119-5452ba323af2\") " pod="openshift-console/downloads-7954f5f757-kqxdj" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856363 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab47206e-15eb-4a18-8ca8-d8cfc7c510ff-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ch42b\" (UID: \"ab47206e-15eb-4a18-8ca8-d8cfc7c510ff\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856379 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzpms\" (UniqueName: \"kubernetes.io/projected/abaafafa-8c2c-497e-9c74-f88fc3fddee7-kube-api-access-vzpms\") pod \"machine-config-controller-84d6567774-6b7zs\" (UID: \"abaafafa-8c2c-497e-9c74-f88fc3fddee7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856398 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-image-import-ca\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856414 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff0d92-ab2f-4815-9659-7b4507d64344-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856442 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9r6h\" (UniqueName: \"kubernetes.io/projected/f197bf6d-1e49-4af0-9d6c-1df2e4fd0222-kube-api-access-n9r6h\") pod \"dns-operator-744455d44c-st7xh\" (UID: \"f197bf6d-1e49-4af0-9d6c-1df2e4fd0222\") " pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856457 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m5jx\" (UniqueName: \"kubernetes.io/projected/97d3467b-ec38-4f92-9bd1-17d3fbeac78a-kube-api-access-5m5jx\") pod \"openshift-apiserver-operator-796bbdcf4f-lqr6w\" (UID: \"97d3467b-ec38-4f92-9bd1-17d3fbeac78a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856471 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgwnt\" (UniqueName: \"kubernetes.io/projected/7a5f4368-709e-47c5-8a7c-669dc97e78c5-kube-api-access-kgwnt\") pod \"olm-operator-6b444d44fb-549dj\" (UID: \"7a5f4368-709e-47c5-8a7c-669dc97e78c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856490 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: 
I1203 22:07:29.856524 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-audit\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856546 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9a079205-7ea0-45ab-aecf-7944fd65888c-images\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: \"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856560 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a37cd1cb-47b4-47da-9457-6ece58cecdb8-serving-cert\") pod \"service-ca-operator-777779d784-mjgm6\" (UID: \"a37cd1cb-47b4-47da-9457-6ece58cecdb8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856577 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79145c71-ecbd-4434-ad66-bb1dc84facff-metrics-certs\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856591 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qstfc\" (UniqueName: \"kubernetes.io/projected/79145c71-ecbd-4434-ad66-bb1dc84facff-kube-api-access-qstfc\") pod \"router-default-5444994796-gqw4d\" (UID: 
\"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856607 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-tmpfs\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856624 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnpsv\" (UniqueName: \"kubernetes.io/projected/9e609cc9-2f55-41ad-8234-f57ef9928b69-kube-api-access-gnpsv\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856639 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab47206e-15eb-4a18-8ca8-d8cfc7c510ff-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ch42b\" (UID: \"ab47206e-15eb-4a18-8ca8-d8cfc7c510ff\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856657 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f19636bc-ac54-4bfb-a75f-63049dd5c460-config\") pod \"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856672 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-etcd-serving-ca\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856687 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-dir\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856711 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-trusted-ca-bundle\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856729 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a079205-7ea0-45ab-aecf-7944fd65888c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: \"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856749 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-node-pullsecrets\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856765 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5604c165-492d-4736-848c-254474834852-encryption-config\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856784 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7334cd7-95ed-4eef-a166-ea13a8a59382-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bl8vb\" (UID: \"b7334cd7-95ed-4eef-a166-ea13a8a59382\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856802 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-oauth-config\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856820 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856836 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w9rm\" (UniqueName: \"kubernetes.io/projected/e7368d40-f4b4-49ea-9d46-fc1cff0c4438-kube-api-access-2w9rm\") pod 
\"openshift-config-operator-7777fb866f-hkwdv\" (UID: \"e7368d40-f4b4-49ea-9d46-fc1cff0c4438\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856854 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f19636bc-ac54-4bfb-a75f-63049dd5c460-auth-proxy-config\") pod \"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856871 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-serving-cert\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856887 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8f89cbd-cef8-468a-973d-6e513dcb4e09-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c2sm5\" (UID: \"c8f89cbd-cef8-468a-973d-6e513dcb4e09\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856905 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sxtx\" (UniqueName: \"kubernetes.io/projected/a37cd1cb-47b4-47da-9457-6ece58cecdb8-kube-api-access-8sxtx\") pod \"service-ca-operator-777779d784-mjgm6\" (UID: \"a37cd1cb-47b4-47da-9457-6ece58cecdb8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" Dec 03 22:07:29 crc kubenswrapper[4830]: 
I1203 22:07:29.856920 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-serving-cert\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856941 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856958 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5604c165-492d-4736-848c-254474834852-audit-policies\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856973 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f19636bc-ac54-4bfb-a75f-63049dd5c460-machine-approver-tls\") pod \"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.856989 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/79145c71-ecbd-4434-ad66-bb1dc84facff-default-certificate\") pod \"router-default-5444994796-gqw4d\" (UID: 
\"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857004 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaafafa-8c2c-497e-9c74-f88fc3fddee7-proxy-tls\") pod \"machine-config-controller-84d6567774-6b7zs\" (UID: \"abaafafa-8c2c-497e-9c74-f88fc3fddee7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857020 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-oauth-serving-cert\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857037 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97d3467b-ec38-4f92-9bd1-17d3fbeac78a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lqr6w\" (UID: \"97d3467b-ec38-4f92-9bd1-17d3fbeac78a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857056 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857071 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5604c165-492d-4736-848c-254474834852-serving-cert\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857088 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b9ff0d92-ab2f-4815-9659-7b4507d64344-images\") pod \"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857103 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/abaafafa-8c2c-497e-9c74-f88fc3fddee7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6b7zs\" (UID: \"abaafafa-8c2c-497e-9c74-f88fc3fddee7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857118 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7a5f4368-709e-47c5-8a7c-669dc97e78c5-srv-cert\") pod \"olm-operator-6b444d44fb-549dj\" (UID: \"7a5f4368-709e-47c5-8a7c-669dc97e78c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857133 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-service-ca\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857148 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857165 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857182 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857200 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n59bp\" (UniqueName: \"kubernetes.io/projected/9a079205-7ea0-45ab-aecf-7944fd65888c-kube-api-access-n59bp\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: \"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857215 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/6cdbc50d-e4fc-4118-9226-657c5103f97d-profile-collector-cert\") pod \"catalog-operator-68c6474976-d6h4q\" (UID: \"6cdbc50d-e4fc-4118-9226-657c5103f97d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857231 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn4pg\" (UniqueName: \"kubernetes.io/projected/e63cd790-9ef7-4a09-b132-e2f85e4310ce-kube-api-access-mn4pg\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857248 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zgps\" (UniqueName: \"kubernetes.io/projected/aeb240dc-cbe3-4b23-b806-4296015a46f0-kube-api-access-5zgps\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857264 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7334cd7-95ed-4eef-a166-ea13a8a59382-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bl8vb\" (UID: \"b7334cd7-95ed-4eef-a166-ea13a8a59382\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857266 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-config\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " 
pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857282 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkxkv\" (UniqueName: \"kubernetes.io/projected/f19636bc-ac54-4bfb-a75f-63049dd5c460-kube-api-access-tkxkv\") pod \"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857298 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-config\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857327 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe812a6-0f2d-49a3-ba8b-9af722589906-config\") pod \"kube-apiserver-operator-766d6c64bb-gtl86\" (UID: \"cfe812a6-0f2d-49a3-ba8b-9af722589906\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.857928 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-image-import-ca\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.858056 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e63cd790-9ef7-4a09-b132-e2f85e4310ce-config\") pod 
\"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.859022 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.859097 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.859662 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-config\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.859693 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.859984 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e63cd790-9ef7-4a09-b132-e2f85e4310ce-service-ca-bundle\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.860140 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-etcd-serving-ca\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.860190 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-dir\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.860238 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-node-pullsecrets\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.860420 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e609cc9-2f55-41ad-8234-f57ef9928b69-config\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.862476 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-encryption-config\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " 
pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.862630 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-audit-dir\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.863223 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ff0d92-ab2f-4815-9659-7b4507d64344-config\") pod \"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.864072 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5604c165-492d-4736-848c-254474834852-encryption-config\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.864655 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.865134 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.867219 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.868553 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.868970 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f19636bc-ac54-4bfb-a75f-63049dd5c460-config\") pod \"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.869036 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f19636bc-ac54-4bfb-a75f-63049dd5c460-auth-proxy-config\") pod \"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.869075 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.869190 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf8a7d50-89ba-4804-8ad8-ae427909d60e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ngg6q\" (UID: \"bf8a7d50-89ba-4804-8ad8-ae427909d60e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.869237 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5604c165-492d-4736-848c-254474834852-audit-policies\") pod \"apiserver-7bbb656c7d-h2wnt\" 
(UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.869278 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5604c165-492d-4736-848c-254474834852-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.870031 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.870234 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-audit\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.870314 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-config\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.871068 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-m7n8r"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.871091 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5604c165-492d-4736-848c-254474834852-audit-dir\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.871104 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.871467 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-policies\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.871615 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-m7n8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.871764 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e609cc9-2f55-41ad-8234-f57ef9928b69-trusted-ca\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.871771 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.871828 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff0d92-ab2f-4815-9659-7b4507d64344-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.872015 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b9ff0d92-ab2f-4815-9659-7b4507d64344-images\") pod \"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.872155 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.872412 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-client-ca\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.872982 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.873157 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/5604c165-492d-4736-848c-254474834852-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.874138 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jbhf7"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.875107 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-kqxdj"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.875442 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e63cd790-9ef7-4a09-b132-e2f85e4310ce-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.876104 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.877064 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.878257 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.878284 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.878348 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-etcd-client\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.878556 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.879322 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.879651 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e609cc9-2f55-41ad-8234-f57ef9928b69-serving-cert\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.879689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e63cd790-9ef7-4a09-b132-e2f85e4310ce-serving-cert\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: 
\"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.879729 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rm2x4"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.879799 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be04ea3-029e-4aab-b86c-211ef277f024-serving-cert\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.880171 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.880315 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vhtwk"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.880614 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-serving-cert\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.881024 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.881124 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.881168 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.882178 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hds2c"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.882836 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f19636bc-ac54-4bfb-a75f-63049dd5c460-machine-approver-tls\") pod \"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.883013 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.883068 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5604c165-492d-4736-848c-254474834852-etcd-client\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc 
kubenswrapper[4830]: I1203 22:07:29.883610 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-96lsr"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.883984 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeb240dc-cbe3-4b23-b806-4296015a46f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.885090 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.885214 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.886004 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5604c165-492d-4736-848c-254474834852-serving-cert\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.889833 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qzbgl"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.890481 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qzbgl" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.891380 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qzbgl"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.892464 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-96lsr"] Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.906686 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.920825 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.940131 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.958575 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcqt8\" (UniqueName: \"kubernetes.io/projected/6cdbc50d-e4fc-4118-9226-657c5103f97d-kube-api-access-qcqt8\") pod \"catalog-operator-68c6474976-d6h4q\" (UID: \"6cdbc50d-e4fc-4118-9226-657c5103f97d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.958635 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f197bf6d-1e49-4af0-9d6c-1df2e4fd0222-metrics-tls\") pod \"dns-operator-744455d44c-st7xh\" (UID: \"f197bf6d-1e49-4af0-9d6c-1df2e4fd0222\") " pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.958669 4830 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-b2f72\" (UniqueName: \"kubernetes.io/projected/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-kube-api-access-b2f72\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.958702 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xrkb\" (UniqueName: \"kubernetes.io/projected/c8f89cbd-cef8-468a-973d-6e513dcb4e09-kube-api-access-4xrkb\") pod \"control-plane-machine-set-operator-78cbb6b69f-c2sm5\" (UID: \"c8f89cbd-cef8-468a-973d-6e513dcb4e09\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.958734 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/79145c71-ecbd-4434-ad66-bb1dc84facff-stats-auth\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.958768 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea0a070a-b4cf-4bd0-abcc-28144b64aafb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vht66\" (UID: \"ea0a070a-b4cf-4bd0-abcc-28144b64aafb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.958813 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e7368d40-f4b4-49ea-9d46-fc1cff0c4438-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hkwdv\" (UID: \"e7368d40-f4b4-49ea-9d46-fc1cff0c4438\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.958872 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7368d40-f4b4-49ea-9d46-fc1cff0c4438-serving-cert\") pod \"openshift-config-operator-7777fb866f-hkwdv\" (UID: \"e7368d40-f4b4-49ea-9d46-fc1cff0c4438\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.958901 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-config\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.958937 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97d3467b-ec38-4f92-9bd1-17d3fbeac78a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lqr6w\" (UID: \"97d3467b-ec38-4f92-9bd1-17d3fbeac78a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.958970 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fht5h\" (UniqueName: \"kubernetes.io/projected/b7334cd7-95ed-4eef-a166-ea13a8a59382-kube-api-access-fht5h\") pod \"kube-storage-version-migrator-operator-b67b599dd-bl8vb\" (UID: \"b7334cd7-95ed-4eef-a166-ea13a8a59382\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959004 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/7a5f4368-709e-47c5-8a7c-669dc97e78c5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-549dj\" (UID: \"7a5f4368-709e-47c5-8a7c-669dc97e78c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959036 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab47206e-15eb-4a18-8ca8-d8cfc7c510ff-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ch42b\" (UID: \"ab47206e-15eb-4a18-8ca8-d8cfc7c510ff\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959069 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzpms\" (UniqueName: \"kubernetes.io/projected/abaafafa-8c2c-497e-9c74-f88fc3fddee7-kube-api-access-vzpms\") pod \"machine-config-controller-84d6567774-6b7zs\" (UID: \"abaafafa-8c2c-497e-9c74-f88fc3fddee7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959099 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hgk2\" (UniqueName: \"kubernetes.io/projected/0feb4ea5-f6ce-42ca-9119-5452ba323af2-kube-api-access-2hgk2\") pod \"downloads-7954f5f757-kqxdj\" (UID: \"0feb4ea5-f6ce-42ca-9119-5452ba323af2\") " pod="openshift-console/downloads-7954f5f757-kqxdj" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959146 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9r6h\" (UniqueName: \"kubernetes.io/projected/f197bf6d-1e49-4af0-9d6c-1df2e4fd0222-kube-api-access-n9r6h\") pod \"dns-operator-744455d44c-st7xh\" (UID: \"f197bf6d-1e49-4af0-9d6c-1df2e4fd0222\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959179 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m5jx\" (UniqueName: \"kubernetes.io/projected/97d3467b-ec38-4f92-9bd1-17d3fbeac78a-kube-api-access-5m5jx\") pod \"openshift-apiserver-operator-796bbdcf4f-lqr6w\" (UID: \"97d3467b-ec38-4f92-9bd1-17d3fbeac78a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959212 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgwnt\" (UniqueName: \"kubernetes.io/projected/7a5f4368-709e-47c5-8a7c-669dc97e78c5-kube-api-access-kgwnt\") pod \"olm-operator-6b444d44fb-549dj\" (UID: \"7a5f4368-709e-47c5-8a7c-669dc97e78c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959249 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9a079205-7ea0-45ab-aecf-7944fd65888c-images\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: \"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959265 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e7368d40-f4b4-49ea-9d46-fc1cff0c4438-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hkwdv\" (UID: \"e7368d40-f4b4-49ea-9d46-fc1cff0c4438\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959282 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a37cd1cb-47b4-47da-9457-6ece58cecdb8-serving-cert\") pod \"service-ca-operator-777779d784-mjgm6\" (UID: \"a37cd1cb-47b4-47da-9457-6ece58cecdb8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959314 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qstfc\" (UniqueName: \"kubernetes.io/projected/79145c71-ecbd-4434-ad66-bb1dc84facff-kube-api-access-qstfc\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959343 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-tmpfs\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959389 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79145c71-ecbd-4434-ad66-bb1dc84facff-metrics-certs\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959420 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab47206e-15eb-4a18-8ca8-d8cfc7c510ff-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ch42b\" (UID: \"ab47206e-15eb-4a18-8ca8-d8cfc7c510ff\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959492 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a079205-7ea0-45ab-aecf-7944fd65888c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: \"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959550 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-trusted-ca-bundle\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959592 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7334cd7-95ed-4eef-a166-ea13a8a59382-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bl8vb\" (UID: \"b7334cd7-95ed-4eef-a166-ea13a8a59382\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959626 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-oauth-config\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959658 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w9rm\" (UniqueName: \"kubernetes.io/projected/e7368d40-f4b4-49ea-9d46-fc1cff0c4438-kube-api-access-2w9rm\") pod \"openshift-config-operator-7777fb866f-hkwdv\" (UID: 
\"e7368d40-f4b4-49ea-9d46-fc1cff0c4438\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959693 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8f89cbd-cef8-468a-973d-6e513dcb4e09-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c2sm5\" (UID: \"c8f89cbd-cef8-468a-973d-6e513dcb4e09\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959725 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sxtx\" (UniqueName: \"kubernetes.io/projected/a37cd1cb-47b4-47da-9457-6ece58cecdb8-kube-api-access-8sxtx\") pod \"service-ca-operator-777779d784-mjgm6\" (UID: \"a37cd1cb-47b4-47da-9457-6ece58cecdb8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959757 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-serving-cert\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959789 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/79145c71-ecbd-4434-ad66-bb1dc84facff-default-certificate\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959820 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaafafa-8c2c-497e-9c74-f88fc3fddee7-proxy-tls\") pod \"machine-config-controller-84d6567774-6b7zs\" (UID: \"abaafafa-8c2c-497e-9c74-f88fc3fddee7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959851 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-oauth-serving-cert\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959882 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97d3467b-ec38-4f92-9bd1-17d3fbeac78a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lqr6w\" (UID: \"97d3467b-ec38-4f92-9bd1-17d3fbeac78a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959914 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/abaafafa-8c2c-497e-9c74-f88fc3fddee7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6b7zs\" (UID: \"abaafafa-8c2c-497e-9c74-f88fc3fddee7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.959943 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7a5f4368-709e-47c5-8a7c-669dc97e78c5-srv-cert\") pod \"olm-operator-6b444d44fb-549dj\" (UID: \"7a5f4368-709e-47c5-8a7c-669dc97e78c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:29 crc 
kubenswrapper[4830]: I1203 22:07:29.959971 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-service-ca\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960005 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n59bp\" (UniqueName: \"kubernetes.io/projected/9a079205-7ea0-45ab-aecf-7944fd65888c-kube-api-access-n59bp\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: \"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960035 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6cdbc50d-e4fc-4118-9226-657c5103f97d-profile-collector-cert\") pod \"catalog-operator-68c6474976-d6h4q\" (UID: \"6cdbc50d-e4fc-4118-9226-657c5103f97d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960086 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7334cd7-95ed-4eef-a166-ea13a8a59382-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bl8vb\" (UID: \"b7334cd7-95ed-4eef-a166-ea13a8a59382\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960141 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe812a6-0f2d-49a3-ba8b-9af722589906-config\") pod 
\"kube-apiserver-operator-766d6c64bb-gtl86\" (UID: \"cfe812a6-0f2d-49a3-ba8b-9af722589906\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960175 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-webhook-cert\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960227 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9mq6\" (UniqueName: \"kubernetes.io/projected/544dd01f-d91c-4195-b649-8d2aa5a54c49-kube-api-access-s9mq6\") pod \"migrator-59844c95c7-tz5gv\" (UID: \"544dd01f-d91c-4195-b649-8d2aa5a54c49\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960266 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfe812a6-0f2d-49a3-ba8b-9af722589906-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gtl86\" (UID: \"cfe812a6-0f2d-49a3-ba8b-9af722589906\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960296 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/20fcc857-dc4e-43a1-83c7-5191075fe805-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-76msc\" (UID: \"20fcc857-dc4e-43a1-83c7-5191075fe805\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960343 4830 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a079205-7ea0-45ab-aecf-7944fd65888c-proxy-tls\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: \"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960373 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-apiservice-cert\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960423 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6cdbc50d-e4fc-4118-9226-657c5103f97d-srv-cert\") pod \"catalog-operator-68c6474976-d6h4q\" (UID: \"6cdbc50d-e4fc-4118-9226-657c5103f97d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960455 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfe812a6-0f2d-49a3-ba8b-9af722589906-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gtl86\" (UID: \"cfe812a6-0f2d-49a3-ba8b-9af722589906\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960578 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79145c71-ecbd-4434-ad66-bb1dc84facff-service-ca-bundle\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:29 crc 
kubenswrapper[4830]: I1203 22:07:29.960613 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8db4\" (UniqueName: \"kubernetes.io/projected/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-kube-api-access-v8db4\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960646 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkjxv\" (UniqueName: \"kubernetes.io/projected/ab47206e-15eb-4a18-8ca8-d8cfc7c510ff-kube-api-access-tkjxv\") pod \"openshift-controller-manager-operator-756b6f6bc6-ch42b\" (UID: \"ab47206e-15eb-4a18-8ca8-d8cfc7c510ff\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960681 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6bb7\" (UniqueName: \"kubernetes.io/projected/20fcc857-dc4e-43a1-83c7-5191075fe805-kube-api-access-l6bb7\") pod \"multus-admission-controller-857f4d67dd-76msc\" (UID: \"20fcc857-dc4e-43a1-83c7-5191075fe805\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960716 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea0a070a-b4cf-4bd0-abcc-28144b64aafb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vht66\" (UID: \"ea0a070a-b4cf-4bd0-abcc-28144b64aafb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960746 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a37cd1cb-47b4-47da-9457-6ece58cecdb8-config\") pod \"service-ca-operator-777779d784-mjgm6\" (UID: \"a37cd1cb-47b4-47da-9457-6ece58cecdb8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.960779 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea0a070a-b4cf-4bd0-abcc-28144b64aafb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vht66\" (UID: \"ea0a070a-b4cf-4bd0-abcc-28144b64aafb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.961012 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-config\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.961217 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9a079205-7ea0-45ab-aecf-7944fd65888c-images\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: \"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.961775 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-tmpfs\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.962752 4830 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7368d40-f4b4-49ea-9d46-fc1cff0c4438-serving-cert\") pod \"openshift-config-operator-7777fb866f-hkwdv\" (UID: \"e7368d40-f4b4-49ea-9d46-fc1cff0c4438\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.962765 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-trusted-ca-bundle\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.962822 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a079205-7ea0-45ab-aecf-7944fd65888c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: \"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.963140 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7334cd7-95ed-4eef-a166-ea13a8a59382-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bl8vb\" (UID: \"b7334cd7-95ed-4eef-a166-ea13a8a59382\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.963151 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-oauth-serving-cert\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: 
I1203 22:07:29.963685 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/abaafafa-8c2c-497e-9c74-f88fc3fddee7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6b7zs\" (UID: \"abaafafa-8c2c-497e-9c74-f88fc3fddee7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.963784 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.964049 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-service-ca\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.964282 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97d3467b-ec38-4f92-9bd1-17d3fbeac78a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lqr6w\" (UID: \"97d3467b-ec38-4f92-9bd1-17d3fbeac78a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.964526 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97d3467b-ec38-4f92-9bd1-17d3fbeac78a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lqr6w\" (UID: \"97d3467b-ec38-4f92-9bd1-17d3fbeac78a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.965098 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b7334cd7-95ed-4eef-a166-ea13a8a59382-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bl8vb\" (UID: \"b7334cd7-95ed-4eef-a166-ea13a8a59382\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.965572 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-oauth-config\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.966019 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-serving-cert\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.966717 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a079205-7ea0-45ab-aecf-7944fd65888c-proxy-tls\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: \"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.975254 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaafafa-8c2c-497e-9c74-f88fc3fddee7-proxy-tls\") pod \"machine-config-controller-84d6567774-6b7zs\" (UID: \"abaafafa-8c2c-497e-9c74-f88fc3fddee7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" Dec 03 22:07:29 crc kubenswrapper[4830]: I1203 22:07:29.980295 4830 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.001544 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.020626 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.026032 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7a5f4368-709e-47c5-8a7c-669dc97e78c5-srv-cert\") pod \"olm-operator-6b444d44fb-549dj\" (UID: \"7a5f4368-709e-47c5-8a7c-669dc97e78c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.040599 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.045097 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7a5f4368-709e-47c5-8a7c-669dc97e78c5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-549dj\" (UID: \"7a5f4368-709e-47c5-8a7c-669dc97e78c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.045828 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6cdbc50d-e4fc-4118-9226-657c5103f97d-profile-collector-cert\") pod \"catalog-operator-68c6474976-d6h4q\" (UID: \"6cdbc50d-e4fc-4118-9226-657c5103f97d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 
22:07:30.060480 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.080838 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.101518 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.105937 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6cdbc50d-e4fc-4118-9226-657c5103f97d-srv-cert\") pod \"catalog-operator-68c6474976-d6h4q\" (UID: \"6cdbc50d-e4fc-4118-9226-657c5103f97d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.120404 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.140684 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.145962 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/20fcc857-dc4e-43a1-83c7-5191075fe805-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-76msc\" (UID: \"20fcc857-dc4e-43a1-83c7-5191075fe805\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.159913 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.179810 4830 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.199714 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.228665 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.241832 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.261431 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.280932 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.302368 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.316130 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfe812a6-0f2d-49a3-ba8b-9af722589906-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gtl86\" (UID: \"cfe812a6-0f2d-49a3-ba8b-9af722589906\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.321462 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.340176 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.349578 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8f89cbd-cef8-468a-973d-6e513dcb4e09-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c2sm5\" (UID: \"c8f89cbd-cef8-468a-973d-6e513dcb4e09\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.361458 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.362655 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe812a6-0f2d-49a3-ba8b-9af722589906-config\") pod \"kube-apiserver-operator-766d6c64bb-gtl86\" (UID: \"cfe812a6-0f2d-49a3-ba8b-9af722589906\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.380594 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.400641 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.404758 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/79145c71-ecbd-4434-ad66-bb1dc84facff-stats-auth\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.421913 4830 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.436057 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/79145c71-ecbd-4434-ad66-bb1dc84facff-default-certificate\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.441021 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.460615 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.480309 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.493778 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f197bf6d-1e49-4af0-9d6c-1df2e4fd0222-metrics-tls\") pod \"dns-operator-744455d44c-st7xh\" (UID: \"f197bf6d-1e49-4af0-9d6c-1df2e4fd0222\") " pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.501154 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.506373 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79145c71-ecbd-4434-ad66-bb1dc84facff-metrics-certs\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " 
pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.521420 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.540775 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.542982 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79145c71-ecbd-4434-ad66-bb1dc84facff-service-ca-bundle\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.560838 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.580759 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.603151 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.621777 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.642183 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.661299 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 
22:07:30.684674 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.696599 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea0a070a-b4cf-4bd0-abcc-28144b64aafb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vht66\" (UID: \"ea0a070a-b4cf-4bd0-abcc-28144b64aafb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.700885 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.703677 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea0a070a-b4cf-4bd0-abcc-28144b64aafb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vht66\" (UID: \"ea0a070a-b4cf-4bd0-abcc-28144b64aafb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.721681 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.727987 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-webhook-cert\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.728065 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-apiservice-cert\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.740988 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.760894 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.780314 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.785926 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab47206e-15eb-4a18-8ca8-d8cfc7c510ff-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ch42b\" (UID: \"ab47206e-15eb-4a18-8ca8-d8cfc7c510ff\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.801153 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.804158 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab47206e-15eb-4a18-8ca8-d8cfc7c510ff-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ch42b\" (UID: \"ab47206e-15eb-4a18-8ca8-d8cfc7c510ff\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.818933 4830 request.go:700] Waited for 1.018455055s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.821007 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.841261 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.842972 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37cd1cb-47b4-47da-9457-6ece58cecdb8-config\") pod \"service-ca-operator-777779d784-mjgm6\" (UID: \"a37cd1cb-47b4-47da-9457-6ece58cecdb8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.861795 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.867297 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a37cd1cb-47b4-47da-9457-6ece58cecdb8-serving-cert\") pod \"service-ca-operator-777779d784-mjgm6\" (UID: \"a37cd1cb-47b4-47da-9457-6ece58cecdb8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.883011 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.901721 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.921454 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.961313 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 22:07:30 crc kubenswrapper[4830]: I1203 22:07:30.980667 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.013346 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.021616 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.040982 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.060609 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.080484 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.101431 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.121358 4830 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.140893 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.160889 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.181464 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.200828 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.220931 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.240858 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.260238 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.280404 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.302211 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.322648 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.341252 4830 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.360408 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.381015 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.424181 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.441715 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.461434 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.481070 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.500631 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.548616 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q44nt\" (UniqueName: \"kubernetes.io/projected/b9ff0d92-ab2f-4815-9659-7b4507d64344-kube-api-access-q44nt\") pod \"machine-api-operator-5694c8668f-c8f8r\" (UID: \"b9ff0d92-ab2f-4815-9659-7b4507d64344\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.564341 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnpsv\" (UniqueName: \"kubernetes.io/projected/9e609cc9-2f55-41ad-8234-f57ef9928b69-kube-api-access-gnpsv\") pod \"console-operator-58897d9998-vtqqc\" (UID: \"9e609cc9-2f55-41ad-8234-f57ef9928b69\") " pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.590628 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqvf5\" (UniqueName: \"kubernetes.io/projected/76bcec30-21d9-4a72-9e84-ee3d19ea64c4-kube-api-access-jqvf5\") pod \"apiserver-76f77b778f-lq5x8\" (UID: \"76bcec30-21d9-4a72-9e84-ee3d19ea64c4\") " pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.592242 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.613818 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrzs4\" (UniqueName: \"kubernetes.io/projected/6be04ea3-029e-4aab-b86c-211ef277f024-kube-api-access-xrzs4\") pod \"controller-manager-879f6c89f-lv6mk\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.625246 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.631883 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4zhx\" (UniqueName: \"kubernetes.io/projected/bf8a7d50-89ba-4804-8ad8-ae427909d60e-kube-api-access-t4zhx\") pod \"cluster-samples-operator-665b6dd947-ngg6q\" (UID: \"bf8a7d50-89ba-4804-8ad8-ae427909d60e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.639855 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pzhf\" (UniqueName: \"kubernetes.io/projected/5604c165-492d-4736-848c-254474834852-kube-api-access-9pzhf\") pod \"apiserver-7bbb656c7d-h2wnt\" (UID: \"5604c165-492d-4736-848c-254474834852\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.661305 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zgps\" (UniqueName: \"kubernetes.io/projected/aeb240dc-cbe3-4b23-b806-4296015a46f0-kube-api-access-5zgps\") pod \"route-controller-manager-6576b87f9c-xtxmm\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.676241 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn4pg\" (UniqueName: \"kubernetes.io/projected/e63cd790-9ef7-4a09-b132-e2f85e4310ce-kube-api-access-mn4pg\") pod \"authentication-operator-69f744f599-wt5n6\" (UID: \"e63cd790-9ef7-4a09-b132-e2f85e4310ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.700667 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkxkv\" (UniqueName: 
\"kubernetes.io/projected/f19636bc-ac54-4bfb-a75f-63049dd5c460-kube-api-access-tkxkv\") pod \"machine-approver-56656f9798-wf7hf\" (UID: \"f19636bc-ac54-4bfb-a75f-63049dd5c460\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.733587 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.735239 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvkbc\" (UniqueName: \"kubernetes.io/projected/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-kube-api-access-wvkbc\") pod \"oauth-openshift-558db77b4-lz46c\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.744861 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.761128 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.781719 4830 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.784866 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.801096 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.814337 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.821323 4830 request.go:700] Waited for 1.933949214s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.822916 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.831934 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.842834 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.870749 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.871424 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.870877 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.871902 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.873022 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lq5x8"] Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.882480 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.893433 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-c8f8r"] Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.900135 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.901368 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.909614 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.939658 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2f72\" (UniqueName: \"kubernetes.io/projected/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-kube-api-access-b2f72\") pod \"console-f9d7485db-drfg4\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.948230 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.955346 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcqt8\" (UniqueName: \"kubernetes.io/projected/6cdbc50d-e4fc-4118-9226-657c5103f97d-kube-api-access-qcqt8\") pod \"catalog-operator-68c6474976-d6h4q\" (UID: \"6cdbc50d-e4fc-4118-9226-657c5103f97d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.984888 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xrkb\" (UniqueName: \"kubernetes.io/projected/c8f89cbd-cef8-468a-973d-6e513dcb4e09-kube-api-access-4xrkb\") pod \"control-plane-machine-set-operator-78cbb6b69f-c2sm5\" (UID: \"c8f89cbd-cef8-468a-973d-6e513dcb4e09\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5" Dec 03 22:07:31 crc kubenswrapper[4830]: I1203 22:07:31.997762 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea0a070a-b4cf-4bd0-abcc-28144b64aafb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vht66\" (UID: \"ea0a070a-b4cf-4bd0-abcc-28144b64aafb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.014939 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hgk2\" (UniqueName: \"kubernetes.io/projected/0feb4ea5-f6ce-42ca-9119-5452ba323af2-kube-api-access-2hgk2\") pod \"downloads-7954f5f757-kqxdj\" (UID: \"0feb4ea5-f6ce-42ca-9119-5452ba323af2\") " pod="openshift-console/downloads-7954f5f757-kqxdj" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.048764 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9r6h\" (UniqueName: 
\"kubernetes.io/projected/f197bf6d-1e49-4af0-9d6c-1df2e4fd0222-kube-api-access-n9r6h\") pod \"dns-operator-744455d44c-st7xh\" (UID: \"f197bf6d-1e49-4af0-9d6c-1df2e4fd0222\") " pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.050138 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv6mk"] Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.058562 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m5jx\" (UniqueName: \"kubernetes.io/projected/97d3467b-ec38-4f92-9bd1-17d3fbeac78a-kube-api-access-5m5jx\") pod \"openshift-apiserver-operator-796bbdcf4f-lqr6w\" (UID: \"97d3467b-ec38-4f92-9bd1-17d3fbeac78a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.058860 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.080252 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgwnt\" (UniqueName: \"kubernetes.io/projected/7a5f4368-709e-47c5-8a7c-669dc97e78c5-kube-api-access-kgwnt\") pod \"olm-operator-6b444d44fb-549dj\" (UID: \"7a5f4368-709e-47c5-8a7c-669dc97e78c5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.090005 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wt5n6"] Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.094525 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzpms\" (UniqueName: \"kubernetes.io/projected/abaafafa-8c2c-497e-9c74-f88fc3fddee7-kube-api-access-vzpms\") pod 
\"machine-config-controller-84d6567774-6b7zs\" (UID: \"abaafafa-8c2c-497e-9c74-f88fc3fddee7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" Dec 03 22:07:32 crc kubenswrapper[4830]: W1203 22:07:32.100053 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6be04ea3_029e_4aab_b86c_211ef277f024.slice/crio-a5a6b713571dd3719105a9ee423caf12f528f674714f4eb1ea4712bef4080b2a WatchSource:0}: Error finding container a5a6b713571dd3719105a9ee423caf12f528f674714f4eb1ea4712bef4080b2a: Status 404 returned error can't find the container with id a5a6b713571dd3719105a9ee423caf12f528f674714f4eb1ea4712bef4080b2a Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.101602 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-kqxdj" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.115306 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fht5h\" (UniqueName: \"kubernetes.io/projected/b7334cd7-95ed-4eef-a166-ea13a8a59382-kube-api-access-fht5h\") pod \"kube-storage-version-migrator-operator-b67b599dd-bl8vb\" (UID: \"b7334cd7-95ed-4eef-a166-ea13a8a59382\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.116186 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.123337 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.134797 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8db4\" (UniqueName: \"kubernetes.io/projected/af2d270a-cbe9-46d8-b720-fa9b3cbf3f07-kube-api-access-v8db4\") pod \"packageserver-d55dfcdfc-hcm9k\" (UID: \"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.144468 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.150201 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.154871 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6bb7\" (UniqueName: \"kubernetes.io/projected/20fcc857-dc4e-43a1-83c7-5191075fe805-kube-api-access-l6bb7\") pod \"multus-admission-controller-857f4d67dd-76msc\" (UID: \"20fcc857-dc4e-43a1-83c7-5191075fe805\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.184208 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" event={"ID":"f19636bc-ac54-4bfb-a75f-63049dd5c460","Type":"ContainerStarted","Data":"612bd7eb1337473cf2a71c2a2cda61dd5314f49d700a754e793f9d034fc8d0ff"} Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.185255 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" 
event={"ID":"76bcec30-21d9-4a72-9e84-ee3d19ea64c4","Type":"ContainerStarted","Data":"9d65b319db857a1716a1a961ce7715f401231b6f9d2c4de728656b88557ca7f6"} Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.186999 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" event={"ID":"6be04ea3-029e-4aab-b86c-211ef277f024","Type":"ContainerStarted","Data":"a5a6b713571dd3719105a9ee423caf12f528f674714f4eb1ea4712bef4080b2a"} Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.187990 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" event={"ID":"b9ff0d92-ab2f-4815-9659-7b4507d64344","Type":"ContainerStarted","Data":"6e09bff1d69123528fa31019386d9e1bee819d1ca5a86236e25b6fe9aa0c3af2"} Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.191006 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" event={"ID":"e63cd790-9ef7-4a09-b132-e2f85e4310ce","Type":"ContainerStarted","Data":"2c806cbe3eb1ae3fbd5aed70ae4a620f1e72e18c772f529e2bcf4565f18fde97"} Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.195848 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n59bp\" (UniqueName: \"kubernetes.io/projected/9a079205-7ea0-45ab-aecf-7944fd65888c-kube-api-access-n59bp\") pod \"machine-config-operator-74547568cd-p2vcn\" (UID: \"9a079205-7ea0-45ab-aecf-7944fd65888c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.213908 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qstfc\" (UniqueName: \"kubernetes.io/projected/79145c71-ecbd-4434-ad66-bb1dc84facff-kube-api-access-qstfc\") pod \"router-default-5444994796-gqw4d\" (UID: \"79145c71-ecbd-4434-ad66-bb1dc84facff\") " 
pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.233218 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfe812a6-0f2d-49a3-ba8b-9af722589906-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gtl86\" (UID: \"cfe812a6-0f2d-49a3-ba8b-9af722589906\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.239263 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.257216 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9mq6\" (UniqueName: \"kubernetes.io/projected/544dd01f-d91c-4195-b649-8d2aa5a54c49-kube-api-access-s9mq6\") pod \"migrator-59844c95c7-tz5gv\" (UID: \"544dd01f-d91c-4195-b649-8d2aa5a54c49\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.270024 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.274393 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w9rm\" (UniqueName: \"kubernetes.io/projected/e7368d40-f4b4-49ea-9d46-fc1cff0c4438-kube-api-access-2w9rm\") pod \"openshift-config-operator-7777fb866f-hkwdv\" (UID: \"e7368d40-f4b4-49ea-9d46-fc1cff0c4438\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.296270 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sxtx\" (UniqueName: \"kubernetes.io/projected/a37cd1cb-47b4-47da-9457-6ece58cecdb8-kube-api-access-8sxtx\") pod \"service-ca-operator-777779d784-mjgm6\" (UID: \"a37cd1cb-47b4-47da-9457-6ece58cecdb8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.311675 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm"] Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.357295 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q"] Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.401723 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lz46c"] Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.412310 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt"] Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.467694 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-drfg4"] Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.474950 4830 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q"] Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.476855 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vtqqc"] Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.991336 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.992167 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.993634 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.994985 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.995754 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.995874 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkjxv\" (UniqueName: \"kubernetes.io/projected/ab47206e-15eb-4a18-8ca8-d8cfc7c510ff-kube-api-access-tkjxv\") pod \"openshift-controller-manager-operator-756b6f6bc6-ch42b\" (UID: \"ab47206e-15eb-4a18-8ca8-d8cfc7c510ff\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" Dec 03 22:07:32 crc kubenswrapper[4830]: I1203 22:07:32.995962 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.008523 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.009155 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.010583 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.018668 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.021185 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-bound-sa-token\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.021283 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2fbc6d25-674b-4886-9c9e-40971da8de89-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.021360 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2fbc6d25-674b-4886-9c9e-40971da8de89-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.021427 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-tls\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.021524 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-trusted-ca\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.021636 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-certificates\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.021819 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.021888 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sg2j\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-kube-api-access-8sg2j\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.026215 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:33.526178663 +0000 UTC m=+142.522640022 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.122972 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.123444 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-03 22:07:33.623426297 +0000 UTC m=+142.619887646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124204 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-certificates\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124294 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5zvd\" (UniqueName: \"kubernetes.io/projected/52c878ad-7067-4961-bbc2-fc794501de21-kube-api-access-m5zvd\") pod \"ingress-canary-qzbgl\" (UID: \"52c878ad-7067-4961-bbc2-fc794501de21\") " pod="openshift-ingress-canary/ingress-canary-qzbgl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124312 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5-node-bootstrap-token\") pod \"machine-config-server-m7n8r\" (UID: \"a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5\") " pod="openshift-machine-config-operator/machine-config-server-m7n8r" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124341 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd-config-volume\") pod \"dns-default-hds2c\" (UID: \"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd\") " pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124368 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124393 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sg2j\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-kube-api-access-8sg2j\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124436 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5731307a-cc06-410f-bfeb-dcae600b121a-signing-key\") pod \"service-ca-9c57cc56f-vhtwk\" (UID: \"5731307a-cc06-410f-bfeb-dcae600b121a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124480 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26c4588c-23f9-4e78-88cf-ced97b89403e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc 
kubenswrapper[4830]: I1203 22:07:33.124545 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9brm\" (UniqueName: \"kubernetes.io/projected/a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5-kube-api-access-v9brm\") pod \"machine-config-server-m7n8r\" (UID: \"a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5\") " pod="openshift-machine-config-operator/machine-config-server-m7n8r" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124591 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0183561f-6785-4549-b38a-49de4135ef09-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124606 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5731307a-cc06-410f-bfeb-dcae600b121a-signing-cabundle\") pod \"service-ca-9c57cc56f-vhtwk\" (UID: \"5731307a-cc06-410f-bfeb-dcae600b121a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124622 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5-certs\") pod \"machine-config-server-m7n8r\" (UID: \"a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5\") " pod="openshift-machine-config-operator/machine-config-server-m7n8r" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124655 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64rb5\" (UniqueName: 
\"kubernetes.io/projected/d85a4e41-11b6-4a03-9ff5-6143a77915ed-kube-api-access-64rb5\") pod \"package-server-manager-789f6589d5-rn2tk\" (UID: \"d85a4e41-11b6-4a03-9ff5-6143a77915ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124682 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d85a4e41-11b6-4a03-9ff5-6143a77915ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rn2tk\" (UID: \"d85a4e41-11b6-4a03-9ff5-6143a77915ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124769 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0183561f-6785-4549-b38a-49de4135ef09-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124784 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0183561f-6785-4549-b38a-49de4135ef09-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124822 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-bound-sa-token\") pod \"image-registry-697d97f7c8-qf7rn\" 
(UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124839 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jq6p\" (UniqueName: \"kubernetes.io/projected/0183561f-6785-4549-b38a-49de4135ef09-kube-api-access-2jq6p\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124857 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2fbc6d25-674b-4886-9c9e-40971da8de89-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124875 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2fbc6d25-674b-4886-9c9e-40971da8de89-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124901 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bpxz\" (UniqueName: \"kubernetes.io/projected/26c4588c-23f9-4e78-88cf-ced97b89403e-kube-api-access-9bpxz\") pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124918 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26c4588c-23f9-4e78-88cf-ced97b89403e-trusted-ca\") pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124936 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-tls\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124951 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l754t\" (UniqueName: \"kubernetes.io/projected/5731307a-cc06-410f-bfeb-dcae600b121a-kube-api-access-l754t\") pod \"service-ca-9c57cc56f-vhtwk\" (UID: \"5731307a-cc06-410f-bfeb-dcae600b121a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.124978 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26c4588c-23f9-4e78-88cf-ced97b89403e-metrics-tls\") pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.125080 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-trusted-ca\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 
03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.125198 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c878ad-7067-4961-bbc2-fc794501de21-cert\") pod \"ingress-canary-qzbgl\" (UID: \"52c878ad-7067-4961-bbc2-fc794501de21\") " pod="openshift-ingress-canary/ingress-canary-qzbgl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.128878 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-trusted-ca\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.129288 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:33.62926461 +0000 UTC m=+142.625725959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.131709 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-certificates\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.132893 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2fbc6d25-674b-4886-9c9e-40971da8de89-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.135283 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2fbc6d25-674b-4886-9c9e-40971da8de89-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.147201 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-tls\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: 
\"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.150318 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sg2j\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-kube-api-access-8sg2j\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.151237 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-bound-sa-token\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.196990 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vtqqc" event={"ID":"9e609cc9-2f55-41ad-8234-f57ef9928b69","Type":"ContainerStarted","Data":"61e41cdf78fa9c4cb991b60ca45ae541b88e09b939ff43259c0c32554f24c441"} Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.201328 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" event={"ID":"6cdbc50d-e4fc-4118-9226-657c5103f97d","Type":"ContainerStarted","Data":"807421a3ffe2b2c88faf100bf991aea4e541bda35beaec4030f0f0fa0c183b2a"} Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.205906 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" event={"ID":"aeb240dc-cbe3-4b23-b806-4296015a46f0","Type":"ContainerStarted","Data":"03ff5b4b6cc3e425faeb631733e94d0cdec582f6ace46b2c5194e2fa8917bd90"} Dec 03 22:07:33 crc 
kubenswrapper[4830]: I1203 22:07:33.215742 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" event={"ID":"bf8a7d50-89ba-4804-8ad8-ae427909d60e","Type":"ContainerStarted","Data":"e3b77e4d01ed90d5bde86394220f9be80c7ed7c498abf08c96825178028ae97b"} Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.216976 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" event={"ID":"ef1fa67c-db0a-4077-92ed-1b55beebf7c6","Type":"ContainerStarted","Data":"9d11fb4a92fc2f1328d7834697956457e1eb1bc113839f693fa2311909608ed5"} Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.218393 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" event={"ID":"b9ff0d92-ab2f-4815-9659-7b4507d64344","Type":"ContainerStarted","Data":"df9de5ce3ce003b7c4d9b44cb990f788703efab0f7724c3db3e91ff670b718a1"} Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.220912 4830 generic.go:334] "Generic (PLEG): container finished" podID="76bcec30-21d9-4a72-9e84-ee3d19ea64c4" containerID="93bc4ac9c1d652405ae4c92c518e3fbe529d8489135afe9d9a1517f0187ccfcc" exitCode=0 Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.221005 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" event={"ID":"76bcec30-21d9-4a72-9e84-ee3d19ea64c4","Type":"ContainerDied","Data":"93bc4ac9c1d652405ae4c92c518e3fbe529d8489135afe9d9a1517f0187ccfcc"} Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.222538 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" event={"ID":"5604c165-492d-4736-848c-254474834852","Type":"ContainerStarted","Data":"55b13b22335900981a9bb1a976049c1d3eae25ddaffb1b0551128ffa9e189dbd"} Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.223367 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-drfg4" event={"ID":"9f6628e7-55ff-4c71-b3e7-102cb3b6954d","Type":"ContainerStarted","Data":"64dd6dd0745667ff722ab0da64d6ad23a6aef43bb07231f8596f4cf9249029f2"} Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.224990 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" event={"ID":"f19636bc-ac54-4bfb-a75f-63049dd5c460","Type":"ContainerStarted","Data":"4448ce29dcbc02c0a2bdecf98f4aa971d3130bbbc13a928b22de800034ccf2b4"} Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.225756 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.225961 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64rb5\" (UniqueName: \"kubernetes.io/projected/d85a4e41-11b6-4a03-9ff5-6143a77915ed-kube-api-access-64rb5\") pod \"package-server-manager-789f6589d5-rn2tk\" (UID: \"d85a4e41-11b6-4a03-9ff5-6143a77915ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.225996 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d85a4e41-11b6-4a03-9ff5-6143a77915ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rn2tk\" (UID: \"d85a4e41-11b6-4a03-9ff5-6143a77915ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.226033 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0183561f-6785-4549-b38a-49de4135ef09-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.226048 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0183561f-6785-4549-b38a-49de4135ef09-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.226078 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5x98\" (UniqueName: \"kubernetes.io/projected/d4274708-7133-40ca-a10d-e3d2c5fba4cf-kube-api-access-g5x98\") pod \"marketplace-operator-79b997595-jbhf7\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.226094 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jbhf7\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.226111 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-registration-dir\") pod 
\"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.226160 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:33.726139265 +0000 UTC m=+142.722600614 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.226337 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jq6p\" (UniqueName: \"kubernetes.io/projected/0183561f-6785-4549-b38a-49de4135ef09-kube-api-access-2jq6p\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.226796 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltjp2\" (UniqueName: \"kubernetes.io/projected/240f3db0-1282-4776-88ea-4a3add0bff2b-kube-api-access-ltjp2\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.226823 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jbhf7\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.226846 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3102808a-d149-4c79-bda3-29ce37d9b96b-config-volume\") pod \"collect-profiles-29413320-f4qrr\" (UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.226875 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bpxz\" (UniqueName: \"kubernetes.io/projected/26c4588c-23f9-4e78-88cf-ced97b89403e-kube-api-access-9bpxz\") pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.226974 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26c4588c-23f9-4e78-88cf-ced97b89403e-trusted-ca\") pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.227564 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l754t\" (UniqueName: \"kubernetes.io/projected/5731307a-cc06-410f-bfeb-dcae600b121a-kube-api-access-l754t\") pod \"service-ca-9c57cc56f-vhtwk\" (UID: \"5731307a-cc06-410f-bfeb-dcae600b121a\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.227598 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bef6234-48ae-4119-8bfb-13df57cf038d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tkdln\" (UID: \"6bef6234-48ae-4119-8bfb-13df57cf038d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.227621 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26c4588c-23f9-4e78-88cf-ced97b89403e-metrics-tls\") pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.227694 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd-metrics-tls\") pod \"dns-default-hds2c\" (UID: \"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd\") " pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.227711 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z78w\" (UniqueName: \"kubernetes.io/projected/3102808a-d149-4c79-bda3-29ce37d9b96b-kube-api-access-2z78w\") pod \"collect-profiles-29413320-f4qrr\" (UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.227749 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3102808a-d149-4c79-bda3-29ce37d9b96b-secret-volume\") pod \"collect-profiles-29413320-f4qrr\" (UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.227793 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-plugins-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.227835 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqjhz\" (UniqueName: \"kubernetes.io/projected/3e45069c-a721-4a08-8f23-37ffd520d843-kube-api-access-wqjhz\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.227915 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c878ad-7067-4961-bbc2-fc794501de21-cert\") pod \"ingress-canary-qzbgl\" (UID: \"52c878ad-7067-4961-bbc2-fc794501de21\") " pod="openshift-ingress-canary/ingress-canary-qzbgl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.227957 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e45069c-a721-4a08-8f23-37ffd520d843-etcd-service-ca\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.227986 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpltm\" (UniqueName: \"kubernetes.io/projected/cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd-kube-api-access-kpltm\") pod \"dns-default-hds2c\" (UID: \"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd\") " pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.228020 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bef6234-48ae-4119-8bfb-13df57cf038d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tkdln\" (UID: \"6bef6234-48ae-4119-8bfb-13df57cf038d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.228076 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3e45069c-a721-4a08-8f23-37ffd520d843-etcd-ca\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.228370 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-mountpoint-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.228401 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5zvd\" (UniqueName: \"kubernetes.io/projected/52c878ad-7067-4961-bbc2-fc794501de21-kube-api-access-m5zvd\") pod \"ingress-canary-qzbgl\" (UID: \"52c878ad-7067-4961-bbc2-fc794501de21\") " 
pod="openshift-ingress-canary/ingress-canary-qzbgl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.228450 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5-node-bootstrap-token\") pod \"machine-config-server-m7n8r\" (UID: \"a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5\") " pod="openshift-machine-config-operator/machine-config-server-m7n8r" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.228475 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bef6234-48ae-4119-8bfb-13df57cf038d-config\") pod \"kube-controller-manager-operator-78b949d7b-tkdln\" (UID: \"6bef6234-48ae-4119-8bfb-13df57cf038d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.228515 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd-config-volume\") pod \"dns-default-hds2c\" (UID: \"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd\") " pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.228576 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.228649 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26c4588c-23f9-4e78-88cf-ced97b89403e-trusted-ca\") 
pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.228650 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e45069c-a721-4a08-8f23-37ffd520d843-config\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.228739 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5731307a-cc06-410f-bfeb-dcae600b121a-signing-key\") pod \"service-ca-9c57cc56f-vhtwk\" (UID: \"5731307a-cc06-410f-bfeb-dcae600b121a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.229715 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:33.729704125 +0000 UTC m=+142.726165474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.230795 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d85a4e41-11b6-4a03-9ff5-6143a77915ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rn2tk\" (UID: \"d85a4e41-11b6-4a03-9ff5-6143a77915ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.230856 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-csi-data-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.231089 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0183561f-6785-4549-b38a-49de4135ef09-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.232008 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/26c4588c-23f9-4e78-88cf-ced97b89403e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.232057 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e45069c-a721-4a08-8f23-37ffd520d843-serving-cert\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.232097 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9brm\" (UniqueName: \"kubernetes.io/projected/a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5-kube-api-access-v9brm\") pod \"machine-config-server-m7n8r\" (UID: \"a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5\") " pod="openshift-machine-config-operator/machine-config-server-m7n8r" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.232120 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e45069c-a721-4a08-8f23-37ffd520d843-etcd-client\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.232173 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0183561f-6785-4549-b38a-49de4135ef09-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 
22:07:33.232191 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5731307a-cc06-410f-bfeb-dcae600b121a-signing-cabundle\") pod \"service-ca-9c57cc56f-vhtwk\" (UID: \"5731307a-cc06-410f-bfeb-dcae600b121a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.232208 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5-certs\") pod \"machine-config-server-m7n8r\" (UID: \"a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5\") " pod="openshift-machine-config-operator/machine-config-server-m7n8r" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.232234 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-socket-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.234818 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd-config-volume\") pod \"dns-default-hds2c\" (UID: \"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd\") " pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.234979 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c878ad-7067-4961-bbc2-fc794501de21-cert\") pod \"ingress-canary-qzbgl\" (UID: \"52c878ad-7067-4961-bbc2-fc794501de21\") " pod="openshift-ingress-canary/ingress-canary-qzbgl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.235028 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5731307a-cc06-410f-bfeb-dcae600b121a-signing-cabundle\") pod \"service-ca-9c57cc56f-vhtwk\" (UID: \"5731307a-cc06-410f-bfeb-dcae600b121a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.236986 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5-node-bootstrap-token\") pod \"machine-config-server-m7n8r\" (UID: \"a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5\") " pod="openshift-machine-config-operator/machine-config-server-m7n8r" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.237042 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-kqxdj"] Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.237613 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5-certs\") pod \"machine-config-server-m7n8r\" (UID: \"a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5\") " pod="openshift-machine-config-operator/machine-config-server-m7n8r" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.239350 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0183561f-6785-4549-b38a-49de4135ef09-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.243654 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5731307a-cc06-410f-bfeb-dcae600b121a-signing-key\") pod \"service-ca-9c57cc56f-vhtwk\" (UID: \"5731307a-cc06-410f-bfeb-dcae600b121a\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.247183 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26c4588c-23f9-4e78-88cf-ced97b89403e-metrics-tls\") pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.257020 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jq6p\" (UniqueName: \"kubernetes.io/projected/0183561f-6785-4549-b38a-49de4135ef09-kube-api-access-2jq6p\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.276498 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64rb5\" (UniqueName: \"kubernetes.io/projected/d85a4e41-11b6-4a03-9ff5-6143a77915ed-kube-api-access-64rb5\") pod \"package-server-manager-789f6589d5-rn2tk\" (UID: \"d85a4e41-11b6-4a03-9ff5-6143a77915ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.304497 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bpxz\" (UniqueName: \"kubernetes.io/projected/26c4588c-23f9-4e78-88cf-ced97b89403e-kube-api-access-9bpxz\") pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.318660 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66"] Dec 03 22:07:33 crc 
kubenswrapper[4830]: I1203 22:07:33.320067 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l754t\" (UniqueName: \"kubernetes.io/projected/5731307a-cc06-410f-bfeb-dcae600b121a-kube-api-access-l754t\") pod \"service-ca-9c57cc56f-vhtwk\" (UID: \"5731307a-cc06-410f-bfeb-dcae600b121a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.333277 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.333413 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpltm\" (UniqueName: \"kubernetes.io/projected/cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd-kube-api-access-kpltm\") pod \"dns-default-hds2c\" (UID: \"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd\") " pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.333446 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bef6234-48ae-4119-8bfb-13df57cf038d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tkdln\" (UID: \"6bef6234-48ae-4119-8bfb-13df57cf038d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.333465 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3e45069c-a721-4a08-8f23-37ffd520d843-etcd-ca\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.333484 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-mountpoint-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.333523 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bef6234-48ae-4119-8bfb-13df57cf038d-config\") pod \"kube-controller-manager-operator-78b949d7b-tkdln\" (UID: \"6bef6234-48ae-4119-8bfb-13df57cf038d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334014 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e45069c-a721-4a08-8f23-37ffd520d843-config\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334059 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-csi-data-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334080 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e45069c-a721-4a08-8f23-37ffd520d843-serving-cert\") pod \"etcd-operator-b45778765-rm2x4\" (UID: 
\"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334119 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e45069c-a721-4a08-8f23-37ffd520d843-etcd-client\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334139 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-socket-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334174 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5x98\" (UniqueName: \"kubernetes.io/projected/d4274708-7133-40ca-a10d-e3d2c5fba4cf-kube-api-access-g5x98\") pod \"marketplace-operator-79b997595-jbhf7\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334189 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jbhf7\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334210 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-registration-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334250 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltjp2\" (UniqueName: \"kubernetes.io/projected/240f3db0-1282-4776-88ea-4a3add0bff2b-kube-api-access-ltjp2\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334281 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jbhf7\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334296 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3102808a-d149-4c79-bda3-29ce37d9b96b-config-volume\") pod \"collect-profiles-29413320-f4qrr\" (UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334314 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-csi-data-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334324 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bef6234-48ae-4119-8bfb-13df57cf038d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tkdln\" (UID: \"6bef6234-48ae-4119-8bfb-13df57cf038d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.334488 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:33.8344653 +0000 UTC m=+142.830926749 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334551 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd-metrics-tls\") pod \"dns-default-hds2c\" (UID: \"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd\") " pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334578 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z78w\" (UniqueName: \"kubernetes.io/projected/3102808a-d149-4c79-bda3-29ce37d9b96b-kube-api-access-2z78w\") pod \"collect-profiles-29413320-f4qrr\" (UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:33 crc 
kubenswrapper[4830]: I1203 22:07:33.334600 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3102808a-d149-4c79-bda3-29ce37d9b96b-secret-volume\") pod \"collect-profiles-29413320-f4qrr\" (UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334624 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-plugins-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334647 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqjhz\" (UniqueName: \"kubernetes.io/projected/3e45069c-a721-4a08-8f23-37ffd520d843-kube-api-access-wqjhz\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.334671 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e45069c-a721-4a08-8f23-37ffd520d843-etcd-service-ca\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.335110 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5zvd\" (UniqueName: \"kubernetes.io/projected/52c878ad-7067-4961-bbc2-fc794501de21-kube-api-access-m5zvd\") pod \"ingress-canary-qzbgl\" (UID: \"52c878ad-7067-4961-bbc2-fc794501de21\") " 
pod="openshift-ingress-canary/ingress-canary-qzbgl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.335652 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e45069c-a721-4a08-8f23-37ffd520d843-config\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.335837 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3e45069c-a721-4a08-8f23-37ffd520d843-etcd-ca\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.335845 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e45069c-a721-4a08-8f23-37ffd520d843-etcd-service-ca\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.336220 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-registration-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.336637 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-mountpoint-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 
22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.336639 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bef6234-48ae-4119-8bfb-13df57cf038d-config\") pod \"kube-controller-manager-operator-78b949d7b-tkdln\" (UID: \"6bef6234-48ae-4119-8bfb-13df57cf038d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.337800 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3102808a-d149-4c79-bda3-29ce37d9b96b-config-volume\") pod \"collect-profiles-29413320-f4qrr\" (UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.337903 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-socket-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.338033 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jbhf7\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.338547 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.337318 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/240f3db0-1282-4776-88ea-4a3add0bff2b-plugins-dir\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.341863 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e45069c-a721-4a08-8f23-37ffd520d843-etcd-client\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.343557 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e45069c-a721-4a08-8f23-37ffd520d843-serving-cert\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.343557 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bef6234-48ae-4119-8bfb-13df57cf038d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tkdln\" (UID: \"6bef6234-48ae-4119-8bfb-13df57cf038d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.345425 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3102808a-d149-4c79-bda3-29ce37d9b96b-secret-volume\") pod \"collect-profiles-29413320-f4qrr\" 
(UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.346899 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jbhf7\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.350570 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd-metrics-tls\") pod \"dns-default-hds2c\" (UID: \"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd\") " pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.367638 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5"] Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.381687 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.396397 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26c4588c-23f9-4e78-88cf-ced97b89403e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9wvcl\" (UID: \"26c4588c-23f9-4e78-88cf-ced97b89403e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.412811 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv"] Dec 03 22:07:33 crc kubenswrapper[4830]: W1203 22:07:33.418936 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea0a070a_b4cf_4bd0_abcc_28144b64aafb.slice/crio-c5e3cdf52bf7b09d432237b61f54fadc7ad48a444754485616b8692a06f4421d WatchSource:0}: Error finding container c5e3cdf52bf7b09d432237b61f54fadc7ad48a444754485616b8692a06f4421d: Status 404 returned error can't find the container with id c5e3cdf52bf7b09d432237b61f54fadc7ad48a444754485616b8692a06f4421d Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.419383 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9brm\" (UniqueName: \"kubernetes.io/projected/a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5-kube-api-access-v9brm\") pod \"machine-config-server-m7n8r\" (UID: \"a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5\") " pod="openshift-machine-config-operator/machine-config-server-m7n8r" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.419454 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0183561f-6785-4549-b38a-49de4135ef09-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-nmglf\" (UID: \"0183561f-6785-4549-b38a-49de4135ef09\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.434846 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w"] Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.435651 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.436135 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:33.936124378 +0000 UTC m=+142.932585717 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.455014 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qzbgl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.461377 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z78w\" (UniqueName: \"kubernetes.io/projected/3102808a-d149-4c79-bda3-29ce37d9b96b-kube-api-access-2z78w\") pod \"collect-profiles-29413320-f4qrr\" (UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.468156 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpltm\" (UniqueName: \"kubernetes.io/projected/cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd-kube-api-access-kpltm\") pod \"dns-default-hds2c\" (UID: \"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd\") " pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.482807 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltjp2\" (UniqueName: \"kubernetes.io/projected/240f3db0-1282-4776-88ea-4a3add0bff2b-kube-api-access-ltjp2\") pod \"csi-hostpathplugin-96lsr\" (UID: \"240f3db0-1282-4776-88ea-4a3add0bff2b\") " pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.502150 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs"] Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.514539 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqjhz\" (UniqueName: \"kubernetes.io/projected/3e45069c-a721-4a08-8f23-37ffd520d843-kube-api-access-wqjhz\") pod \"etcd-operator-b45778765-rm2x4\" (UID: \"3e45069c-a721-4a08-8f23-37ffd520d843\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.516973 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bef6234-48ae-4119-8bfb-13df57cf038d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tkdln\" (UID: \"6bef6234-48ae-4119-8bfb-13df57cf038d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.519559 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.529499 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-st7xh"] Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.534228 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5x98\" (UniqueName: \"kubernetes.io/projected/d4274708-7133-40ca-a10d-e3d2c5fba4cf-kube-api-access-g5x98\") pod \"marketplace-operator-79b997595-jbhf7\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.536849 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.537102 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.037084757 +0000 UTC m=+143.033546106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.597935 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.638227 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.638574 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.13856258 +0000 UTC m=+143.135023929 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.671356 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.688597 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.697726 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.700808 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.711975 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.715468 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-m7n8r" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.739638 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.739810 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.239794926 +0000 UTC m=+143.236256275 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.740124 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.740427 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.240418144 +0000 UTC m=+143.236879493 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.740988 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-96lsr" Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.742648 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-76msc"] Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.841265 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.841360 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.341345402 +0000 UTC m=+143.337806751 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.841781 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.842016 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.34200822 +0000 UTC m=+143.338469559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.937949 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn"] Dec 03 22:07:33 crc kubenswrapper[4830]: I1203 22:07:33.948830 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:33 crc kubenswrapper[4830]: E1203 22:07:33.949078 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.449063329 +0000 UTC m=+143.445524678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.051262 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:34 crc kubenswrapper[4830]: E1203 22:07:34.052831 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.552818516 +0000 UTC m=+143.549279865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.085398 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.087925 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.122831 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.123082 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.141940 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.144248 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.176411 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:34 crc kubenswrapper[4830]: E1203 22:07:34.176887 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.676871499 +0000 UTC m=+143.673332848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.181394 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vhtwk"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.182673 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.192703 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.257902 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kqxdj" event={"ID":"0feb4ea5-f6ce-42ca-9119-5452ba323af2","Type":"ContainerStarted","Data":"6b7767aa0202e0ba308b4a4896f2dad0a98a14d8fb6ceeb12b34451946f1afdf"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.257951 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-kqxdj" 
event={"ID":"0feb4ea5-f6ce-42ca-9119-5452ba323af2","Type":"ContainerStarted","Data":"f2b9ba78e80af07a0b7b3486930c532b2719bbd91d935b92878ba8aa6486b5cd"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.258299 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-kqxdj" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.259154 4830 patch_prober.go:28] interesting pod/downloads-7954f5f757-kqxdj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.259191 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kqxdj" podUID="0feb4ea5-f6ce-42ca-9119-5452ba323af2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.260751 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" event={"ID":"20fcc857-dc4e-43a1-83c7-5191075fe805","Type":"ContainerStarted","Data":"429432d52eded27d92d2de1951ad596b88ccf0a4d30808aeddae0a0d9450aeb2"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.270493 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5" event={"ID":"c8f89cbd-cef8-468a-973d-6e513dcb4e09","Type":"ContainerStarted","Data":"116298e21132d9b6e87818fecdf6f0ff084c4e5784a2acb35a9167a817a61390"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.273761 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" 
event={"ID":"bf8a7d50-89ba-4804-8ad8-ae427909d60e","Type":"ContainerStarted","Data":"6beb23f6c737a3b143e3f9cca7f564187597eb642858209b8eca6d31990e2072"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.277837 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:34 crc kubenswrapper[4830]: E1203 22:07:34.278143 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.778131807 +0000 UTC m=+143.774593146 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.288206 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-drfg4" event={"ID":"9f6628e7-55ff-4c71-b3e7-102cb3b6954d","Type":"ContainerStarted","Data":"1e734944c3a0ac7aef3936378640e5338f592ef96fa9bf6958cbe649667ea342"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.295538 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-kqxdj" podStartSLOduration=124.295519612 podStartE2EDuration="2m4.295519612s" 
podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:34.293103105 +0000 UTC m=+143.289564454" watchObservedRunningTime="2025-12-03 22:07:34.295519612 +0000 UTC m=+143.291980961" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.307796 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" event={"ID":"ea0a070a-b4cf-4bd0-abcc-28144b64aafb","Type":"ContainerStarted","Data":"c5e3cdf52bf7b09d432237b61f54fadc7ad48a444754485616b8692a06f4421d"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.309362 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" event={"ID":"ef1fa67c-db0a-4077-92ed-1b55beebf7c6","Type":"ContainerStarted","Data":"abec3c7aab5040ef27c81bba8b6b8cb63594e64d72337a86748fa59dc16e7c9d"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.309622 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qzbgl"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.309866 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.314875 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" event={"ID":"f197bf6d-1e49-4af0-9d6c-1df2e4fd0222","Type":"ContainerStarted","Data":"2751e52ecf569318345aa8383e13bfb82dffca3e52a7be8b7e73c2ea053dd2f7"} Dec 03 22:07:34 crc kubenswrapper[4830]: W1203 22:07:34.324383 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7334cd7_95ed_4eef_a166_ea13a8a59382.slice/crio-af6b7733aeff4a4e657c54100e234b102dbbada17e92f4954f0719a83a308d99 WatchSource:0}: Error finding container af6b7733aeff4a4e657c54100e234b102dbbada17e92f4954f0719a83a308d99: Status 404 returned error can't find the container with id af6b7733aeff4a4e657c54100e234b102dbbada17e92f4954f0719a83a308d99 Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.325300 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" event={"ID":"aeb240dc-cbe3-4b23-b806-4296015a46f0","Type":"ContainerStarted","Data":"e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.325487 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.345647 4830 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-lz46c container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.345700 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" podUID="ef1fa67c-db0a-4077-92ed-1b55beebf7c6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.365605 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" 
event={"ID":"e63cd790-9ef7-4a09-b132-e2f85e4310ce","Type":"ContainerStarted","Data":"535174a8e4298a44ee585f0bc6f10cdb254c2d160848b0f65e88f65ec58f9d02"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.378844 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.378937 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" podStartSLOduration=124.378921911 podStartE2EDuration="2m4.378921911s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:34.377807899 +0000 UTC m=+143.374269248" watchObservedRunningTime="2025-12-03 22:07:34.378921911 +0000 UTC m=+143.375383260" Dec 03 22:07:34 crc kubenswrapper[4830]: E1203 22:07:34.379489 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.879474886 +0000 UTC m=+143.875936235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.396204 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.407983 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" event={"ID":"b9ff0d92-ab2f-4815-9659-7b4507d64344","Type":"ContainerStarted","Data":"1cac1071d4dcd182560c209632802cbc20b1e2c3e4c306de42aba4f8b2edbed8"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.415127 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gqw4d" event={"ID":"79145c71-ecbd-4434-ad66-bb1dc84facff","Type":"ContainerStarted","Data":"07a0446e773a21a5eed6ec827ae750a3c85ad81bc778e198f54213c0fc726bd5"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.436863 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" event={"ID":"6cdbc50d-e4fc-4118-9226-657c5103f97d","Type":"ContainerStarted","Data":"96da8e5e33d46a7dbefaf47a434cfb9bd19f80118f59d5e3c78b011d5e16b260"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.437322 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.455900 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.467694 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" event={"ID":"6be04ea3-029e-4aab-b86c-211ef277f024","Type":"ContainerStarted","Data":"d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.469073 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.481094 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.486867 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:07:34 crc kubenswrapper[4830]: E1203 22:07:34.489748 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:34.989730564 +0000 UTC m=+143.986191913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.493576 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" event={"ID":"97d3467b-ec38-4f92-9bd1-17d3fbeac78a","Type":"ContainerStarted","Data":"9959d404577efd3aa5468ff06ecf416de160a6a735738163e624b2017d3b5ad9"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.514956 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" event={"ID":"f19636bc-ac54-4bfb-a75f-63049dd5c460","Type":"ContainerStarted","Data":"2ebb35f5b5c9d43c4f394697deff9ebe67a1d136e5767fb69a5a6c0d5cfcf170"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.523634 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" event={"ID":"e7368d40-f4b4-49ea-9d46-fc1cff0c4438","Type":"ContainerStarted","Data":"1c712127455d69e0d35bc1279615b82f8e2654d2ccc108e5c82279acd43b0057"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.541989 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.546710 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl"] Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.560061 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" event={"ID":"9a079205-7ea0-45ab-aecf-7944fd65888c","Type":"ContainerStarted","Data":"78a5b0a3e2e1df7bc74b33d9f586b7f971f6fa47373b31f758cfc70775252d1a"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.563669 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" event={"ID":"abaafafa-8c2c-497e-9c74-f88fc3fddee7","Type":"ContainerStarted","Data":"8544e79f2fbbaaae9b28655a3a7f2e6f948fde531eac4f3ec03a118f3eee9460"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.566106 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vtqqc" event={"ID":"9e609cc9-2f55-41ad-8234-f57ef9928b69","Type":"ContainerStarted","Data":"b4824fcd8e01f0eaa62ce23e9903d1281cae76f5fdcc539903ff370c78f72652"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.567506 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.583327 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:34 crc kubenswrapper[4830]: E1203 22:07:34.583839 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:35.083810911 +0000 UTC m=+144.080272260 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.634748 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-m7n8r" event={"ID":"a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5","Type":"ContainerStarted","Data":"cf83e004d4b7ad5c9ad23da6aacf8b4f8851f5c7e87dd0b02bec5dcde6e13d72"} Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.670620 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" podStartSLOduration=123.670602544 podStartE2EDuration="2m3.670602544s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:34.670037298 +0000 UTC m=+143.666498647" watchObservedRunningTime="2025-12-03 22:07:34.670602544 +0000 UTC m=+143.667063893" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.685586 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:34 crc kubenswrapper[4830]: E1203 22:07:34.685867 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:35.1858555 +0000 UTC m=+144.182316849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.786895 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:34 crc kubenswrapper[4830]: E1203 22:07:34.787062 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:35.287039875 +0000 UTC m=+144.283501224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.787179 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:34 crc kubenswrapper[4830]: E1203 22:07:34.787557 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:35.287544819 +0000 UTC m=+144.284006168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.873064 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-drfg4" podStartSLOduration=124.873030496 podStartE2EDuration="2m4.873030496s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:34.871977297 +0000 UTC m=+143.868438646" watchObservedRunningTime="2025-12-03 22:07:34.873030496 +0000 UTC m=+143.869491845" Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.903027 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:34 crc kubenswrapper[4830]: E1203 22:07:34.903275 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:35.40324933 +0000 UTC m=+144.399710679 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.903434 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:34 crc kubenswrapper[4830]: E1203 22:07:34.903828 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:35.403821185 +0000 UTC m=+144.400282534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:34 crc kubenswrapper[4830]: I1203 22:07:34.972778 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rm2x4"] Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.005117 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:35 crc kubenswrapper[4830]: E1203 22:07:35.005464 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:35.505449833 +0000 UTC m=+144.501911182 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.108236 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:35 crc kubenswrapper[4830]: E1203 22:07:35.108709 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:35.608693255 +0000 UTC m=+144.605154604 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.112796 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vtqqc" podStartSLOduration=125.11278407 podStartE2EDuration="2m5.11278407s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:35.111140784 +0000 UTC m=+144.107602133" watchObservedRunningTime="2025-12-03 22:07:35.11278407 +0000 UTC m=+144.109245409" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.157141 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" podStartSLOduration=124.157123837 podStartE2EDuration="2m4.157123837s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:35.154556775 +0000 UTC m=+144.151018124" watchObservedRunningTime="2025-12-03 22:07:35.157123837 +0000 UTC m=+144.153585186" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.212069 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:35 crc kubenswrapper[4830]: E1203 22:07:35.212636 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:35.712620817 +0000 UTC m=+144.709082166 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.227303 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wf7hf" podStartSLOduration=125.227282576 podStartE2EDuration="2m5.227282576s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:35.184711827 +0000 UTC m=+144.181173176" watchObservedRunningTime="2025-12-03 22:07:35.227282576 +0000 UTC m=+144.223743925" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.271510 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wt5n6" podStartSLOduration=125.27149118 podStartE2EDuration="2m5.27149118s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:35.229792936 +0000 UTC 
m=+144.226254285" watchObservedRunningTime="2025-12-03 22:07:35.27149118 +0000 UTC m=+144.267952539" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.302108 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d6h4q" podStartSLOduration=124.302090305 podStartE2EDuration="2m4.302090305s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:35.300035837 +0000 UTC m=+144.296497186" watchObservedRunningTime="2025-12-03 22:07:35.302090305 +0000 UTC m=+144.298551654" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.321380 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:35 crc kubenswrapper[4830]: E1203 22:07:35.322468 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:35.822450703 +0000 UTC m=+144.818912042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.415519 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vtqqc" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.469589 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-c8f8r" podStartSLOduration=124.469500219 podStartE2EDuration="2m4.469500219s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:35.387352186 +0000 UTC m=+144.383813535" watchObservedRunningTime="2025-12-03 22:07:35.469500219 +0000 UTC m=+144.465961568" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.477334 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:35 crc kubenswrapper[4830]: E1203 22:07:35.477730 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 22:07:35.977715948 +0000 UTC m=+144.974177297 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.583572 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:35 crc kubenswrapper[4830]: E1203 22:07:35.583854 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:36.083841392 +0000 UTC m=+145.080302741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.628229 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hds2c"] Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.685200 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:35 crc kubenswrapper[4830]: E1203 22:07:35.685785 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:36.185764377 +0000 UTC m=+145.182225726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.737411 4830 generic.go:334] "Generic (PLEG): container finished" podID="e7368d40-f4b4-49ea-9d46-fc1cff0c4438" containerID="a6bdc71c7eb5c10a521e5d95bb799a8c7d1825609328012680d4d58c78ebb935" exitCode=0 Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.737447 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" event={"ID":"e7368d40-f4b4-49ea-9d46-fc1cff0c4438","Type":"ContainerDied","Data":"a6bdc71c7eb5c10a521e5d95bb799a8c7d1825609328012680d4d58c78ebb935"} Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.755134 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" event={"ID":"ea0a070a-b4cf-4bd0-abcc-28144b64aafb","Type":"ContainerStarted","Data":"320187e617f3d3ec616b5f3559e5b60c920df51f6fc480dafa86598d1f3be14b"} Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.761902 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mcjzb"] Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.763647 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.779250 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mcjzb"] Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.787205 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:35 crc kubenswrapper[4830]: E1203 22:07:35.787497 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:36.287485986 +0000 UTC m=+145.283947335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.789229 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.800088 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" event={"ID":"abaafafa-8c2c-497e-9c74-f88fc3fddee7","Type":"ContainerStarted","Data":"fd1f20def15f625c12d8954b8c05e0b8c0dfa5335dbce456debac420f623a3fa"} Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.849501 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vht66" podStartSLOduration=124.849454026 podStartE2EDuration="2m4.849454026s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:35.812374611 +0000 UTC m=+144.808835960" watchObservedRunningTime="2025-12-03 22:07:35.849454026 +0000 UTC m=+144.845915375" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.855215 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln"] Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.871821 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" event={"ID":"3e45069c-a721-4a08-8f23-37ffd520d843","Type":"ContainerStarted","Data":"faa84ae685ff001d0a1296ec040b8892b3c8dcef28ab045ce3d4ac90373144aa"} Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.873463 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" event={"ID":"ab47206e-15eb-4a18-8ca8-d8cfc7c510ff","Type":"ContainerStarted","Data":"08d04e7a22db5db7ac30c9df8dac8a881adb542a942ec22db8d0fc5137dfaff6"} Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.888901 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.889189 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25g85\" (UniqueName: \"kubernetes.io/projected/39a2a67d-4e6e-4514-9304-966057dd71bf-kube-api-access-25g85\") pod \"certified-operators-mcjzb\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.889220 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-utilities\") pod \"certified-operators-mcjzb\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.889298 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-catalog-content\") pod \"certified-operators-mcjzb\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:07:35 crc kubenswrapper[4830]: E1203 22:07:35.900596 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:36.400557054 +0000 UTC m=+145.397018403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.903032 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.904406 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gqw4d" event={"ID":"79145c71-ecbd-4434-ad66-bb1dc84facff","Type":"ContainerStarted","Data":"6d30719a97604959eaf88c0e0d32445f4b4a58c6d7bed049e5565a31cf959a4c"} Dec 03 22:07:35 crc kubenswrapper[4830]: E1203 22:07:35.916382 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:36.416360344 +0000 UTC m=+145.412821683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.952846 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gqw4d" podStartSLOduration=124.952810002 podStartE2EDuration="2m4.952810002s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:35.935809628 +0000 UTC m=+144.932270977" watchObservedRunningTime="2025-12-03 22:07:35.952810002 +0000 UTC m=+144.949271351" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.970283 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gxd8v"] Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.971231 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.971510 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" event={"ID":"97d3467b-ec38-4f92-9bd1-17d3fbeac78a","Type":"ContainerStarted","Data":"dd63c39cc2f2fe9f16b8732bdfbdb93efabde70e357bfe9850af366814ecb6dc"} Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.974527 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 22:07:35 crc kubenswrapper[4830]: I1203 22:07:35.989852 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-96lsr"] Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.011662 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gxd8v"] Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.019138 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.019976 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.020090 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhdqc\" (UniqueName: \"kubernetes.io/projected/d80631b4-3fa5-491b-b330-80f733c3b0a4-kube-api-access-xhdqc\") pod \"community-operators-gxd8v\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " pod="openshift-marketplace/community-operators-gxd8v" Dec 03 
22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.020153 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25g85\" (UniqueName: \"kubernetes.io/projected/39a2a67d-4e6e-4514-9304-966057dd71bf-kube-api-access-25g85\") pod \"certified-operators-mcjzb\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.020202 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-utilities\") pod \"certified-operators-mcjzb\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.020222 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-catalog-content\") pod \"certified-operators-mcjzb\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.020303 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-utilities\") pod \"community-operators-gxd8v\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.020337 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-catalog-content\") pod \"community-operators-gxd8v\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " 
pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:07:36 crc kubenswrapper[4830]: E1203 22:07:36.020449 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:36.520433971 +0000 UTC m=+145.516895310 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.028745 4830 patch_prober.go:28] interesting pod/router-default-5444994796-gqw4d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:36 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Dec 03 22:07:36 crc kubenswrapper[4830]: [+]process-running ok Dec 03 22:07:36 crc kubenswrapper[4830]: healthz check failed Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.028790 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gqw4d" podUID="79145c71-ecbd-4434-ad66-bb1dc84facff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.039072 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-utilities\") pod 
\"certified-operators-mcjzb\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.039891 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-catalog-content\") pod \"certified-operators-mcjzb\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.061888 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv" event={"ID":"544dd01f-d91c-4195-b649-8d2aa5a54c49","Type":"ContainerStarted","Data":"7a3ae410b736022dc4b740e665cf4bbacbaeb94b27612d50a04b1059557a6f4a"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.061943 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv" event={"ID":"544dd01f-d91c-4195-b649-8d2aa5a54c49","Type":"ContainerStarted","Data":"4532c5a7e7b942f9bb3f62b42d4635fefda5a749355e58ffa091d83597bbe46a"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.062299 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqr6w" podStartSLOduration=126.062283299 podStartE2EDuration="2m6.062283299s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:36.04367288 +0000 UTC m=+145.040134239" watchObservedRunningTime="2025-12-03 22:07:36.062283299 +0000 UTC m=+145.058744648" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.062487 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr"] Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.072822 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" event={"ID":"26c4588c-23f9-4e78-88cf-ced97b89403e","Type":"ContainerStarted","Data":"a2874f13d85a01b668dbf489a1e06e8ea869430d861ec6b73daa9b8c87797d4e"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.090654 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" event={"ID":"a37cd1cb-47b4-47da-9457-6ece58cecdb8","Type":"ContainerStarted","Data":"6825993435ec8aca3a1c44d2569a68b490b3d74e54d9b15ce2760f0c4aee110e"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.100773 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25g85\" (UniqueName: \"kubernetes.io/projected/39a2a67d-4e6e-4514-9304-966057dd71bf-kube-api-access-25g85\") pod \"certified-operators-mcjzb\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.110894 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jbhf7"] Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.121294 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhdqc\" (UniqueName: \"kubernetes.io/projected/d80631b4-3fa5-491b-b330-80f733c3b0a4-kube-api-access-xhdqc\") pod \"community-operators-gxd8v\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.121359 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.121400 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-utilities\") pod \"community-operators-gxd8v\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.121418 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-catalog-content\") pod \"community-operators-gxd8v\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.121912 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-catalog-content\") pod \"community-operators-gxd8v\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.148843 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" event={"ID":"b7334cd7-95ed-4eef-a166-ea13a8a59382","Type":"ContainerStarted","Data":"af6b7733aeff4a4e657c54100e234b102dbbada17e92f4954f0719a83a308d99"} Dec 03 22:07:36 crc kubenswrapper[4830]: E1203 22:07:36.150816 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:36.6508042 +0000 UTC m=+145.647265549 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.151056 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-utilities\") pod \"community-operators-gxd8v\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.170568 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p8x7k"] Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.173185 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p8x7k" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.220588 4830 generic.go:334] "Generic (PLEG): container finished" podID="5604c165-492d-4736-848c-254474834852" containerID="2fa764b066c3b916fdd74c5a20a1293a73de1d34794f7e24d0422f274dc86383" exitCode=0 Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.220965 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" event={"ID":"5604c165-492d-4736-848c-254474834852","Type":"ContainerDied","Data":"2fa764b066c3b916fdd74c5a20a1293a73de1d34794f7e24d0422f274dc86383"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.225069 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:36 crc kubenswrapper[4830]: E1203 22:07:36.225482 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:36.725461375 +0000 UTC m=+145.721922724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.245696 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5" event={"ID":"c8f89cbd-cef8-468a-973d-6e513dcb4e09","Type":"ContainerStarted","Data":"dbc5dbf8a05b91739ca3e131eecd5e71ca326c219e056e1e3fd96ac713b7faa6"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.249980 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p8x7k"] Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.257721 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhdqc\" (UniqueName: \"kubernetes.io/projected/d80631b4-3fa5-491b-b330-80f733c3b0a4-kube-api-access-xhdqc\") pod \"community-operators-gxd8v\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.320845 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" event={"ID":"bf8a7d50-89ba-4804-8ad8-ae427909d60e","Type":"ContainerStarted","Data":"a645e67ca8343807e842c27fc465b2343a3d9cbd8b5c8a5ab5300503c0ce726c"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.329977 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd4rw\" (UniqueName: 
\"kubernetes.io/projected/d62dec8b-fae8-4022-bfb0-485be07c4700-kube-api-access-zd4rw\") pod \"certified-operators-p8x7k\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") " pod="openshift-marketplace/certified-operators-p8x7k" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.330054 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-catalog-content\") pod \"certified-operators-p8x7k\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") " pod="openshift-marketplace/certified-operators-p8x7k" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.330093 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-utilities\") pod \"certified-operators-p8x7k\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") " pod="openshift-marketplace/certified-operators-p8x7k" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.330128 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:36 crc kubenswrapper[4830]: E1203 22:07:36.331226 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:36.831213727 +0000 UTC m=+145.827675076 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.365294 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c2sm5" podStartSLOduration=125.365276549 podStartE2EDuration="2m5.365276549s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:36.329212032 +0000 UTC m=+145.325673381" watchObservedRunningTime="2025-12-03 22:07:36.365276549 +0000 UTC m=+145.361737898" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.366046 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ngg6q" podStartSLOduration=126.36604161 podStartE2EDuration="2m6.36604161s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:36.364098575 +0000 UTC m=+145.360559924" watchObservedRunningTime="2025-12-03 22:07:36.36604161 +0000 UTC m=+145.362502959" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.382721 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9k72v"] Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.383642 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9k72v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.389444 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9k72v"] Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.397026 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" event={"ID":"d85a4e41-11b6-4a03-9ff5-6143a77915ed","Type":"ContainerStarted","Data":"618af181d37d43e4e8fbeb50d56f208a908584048dcdcc6140a2281aebb65d89"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.397062 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" event={"ID":"d85a4e41-11b6-4a03-9ff5-6143a77915ed","Type":"ContainerStarted","Data":"bcfb62a8126f8c767ae62af46a316f8a7accffdb4a97645d66982b868c15e58d"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.437443 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.437595 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-utilities\") pod \"certified-operators-p8x7k\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") " pod="openshift-marketplace/certified-operators-p8x7k" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.437692 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd4rw\" (UniqueName: 
\"kubernetes.io/projected/d62dec8b-fae8-4022-bfb0-485be07c4700-kube-api-access-zd4rw\") pod \"certified-operators-p8x7k\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") " pod="openshift-marketplace/certified-operators-p8x7k" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.437748 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-catalog-content\") pod \"certified-operators-p8x7k\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") " pod="openshift-marketplace/certified-operators-p8x7k" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.438242 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-catalog-content\") pod \"certified-operators-p8x7k\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") " pod="openshift-marketplace/certified-operators-p8x7k" Dec 03 22:07:36 crc kubenswrapper[4830]: E1203 22:07:36.438308 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:36.938291797 +0000 UTC m=+145.934753146 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.438506 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-utilities\") pod \"certified-operators-p8x7k\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") " pod="openshift-marketplace/certified-operators-p8x7k" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.467425 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd4rw\" (UniqueName: \"kubernetes.io/projected/d62dec8b-fae8-4022-bfb0-485be07c4700-kube-api-access-zd4rw\") pod \"certified-operators-p8x7k\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") " pod="openshift-marketplace/certified-operators-p8x7k" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.503746 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-m7n8r" event={"ID":"a1a937d0-7cc0-4ecc-bce6-312c3eb5ddc5","Type":"ContainerStarted","Data":"746dd9fbd9ff049dbdaf8bfc9101389b20449b87b3cfc3b89438329d5fafe1f3"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.540021 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xlvh\" (UniqueName: \"kubernetes.io/projected/7ae91290-b6c1-4aab-a373-dec2848c94db-kube-api-access-5xlvh\") pod \"community-operators-9k72v\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") " pod="openshift-marketplace/community-operators-9k72v" Dec 03 
22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.540456 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-catalog-content\") pod \"community-operators-9k72v\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") " pod="openshift-marketplace/community-operators-9k72v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.540498 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-utilities\") pod \"community-operators-9k72v\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") " pod="openshift-marketplace/community-operators-9k72v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.540606 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.540735 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-m7n8r" podStartSLOduration=7.540725087 podStartE2EDuration="7.540725087s" podCreationTimestamp="2025-12-03 22:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:36.539689768 +0000 UTC m=+145.536151117" watchObservedRunningTime="2025-12-03 22:07:36.540725087 +0000 UTC m=+145.537186436" Dec 03 22:07:36 crc kubenswrapper[4830]: E1203 22:07:36.540864 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:37.04085404 +0000 UTC m=+146.037315379 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.554428 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" event={"ID":"cfe812a6-0f2d-49a3-ba8b-9af722589906","Type":"ContainerStarted","Data":"04cffabe9242c85c81e3cfef6a02d195764b86d770aecfd0bad07e77c21aadaa"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.577007 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.577343 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" event={"ID":"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07","Type":"ContainerStarted","Data":"996bdedc81c5ca105c9766c06683996f6a9ca439a865fd9f8e820ee11383d58c"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.577388 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" event={"ID":"af2d270a-cbe9-46d8-b720-fa9b3cbf3f07","Type":"ContainerStarted","Data":"47b6be4ee4d5bd75fbc823bed64418e399d48477b03a28abc4eb4e1c6c4f8156"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.577753 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.602018 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.602822 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" podStartSLOduration=125.60281293 podStartE2EDuration="2m5.60281293s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:36.60210227 +0000 UTC m=+145.598563609" watchObservedRunningTime="2025-12-03 22:07:36.60281293 +0000 UTC m=+145.599274279" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.621233 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" event={"ID":"76bcec30-21d9-4a72-9e84-ee3d19ea64c4","Type":"ContainerStarted","Data":"81246841564da39b21f5249a1600133d4f39f40ab303c750c92cae170126dd58"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.622590 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" event={"ID":"7a5f4368-709e-47c5-8a7c-669dc97e78c5","Type":"ContainerStarted","Data":"7202308fddb0cf05cf1b10de6cce6a8786ae79a1c322453389f367e655be1422"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.623404 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.638342 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p8x7k" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.639145 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.639184 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.642110 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.642470 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xlvh\" (UniqueName: \"kubernetes.io/projected/7ae91290-b6c1-4aab-a373-dec2848c94db-kube-api-access-5xlvh\") pod \"community-operators-9k72v\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") " pod="openshift-marketplace/community-operators-9k72v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.642604 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-catalog-content\") pod \"community-operators-9k72v\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") " pod="openshift-marketplace/community-operators-9k72v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.642662 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-utilities\") pod \"community-operators-9k72v\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") " 
pod="openshift-marketplace/community-operators-9k72v" Dec 03 22:07:36 crc kubenswrapper[4830]: E1203 22:07:36.643874 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:37.143846106 +0000 UTC m=+146.140307455 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.647119 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-utilities\") pod \"community-operators-9k72v\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") " pod="openshift-marketplace/community-operators-9k72v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.652036 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-catalog-content\") pod \"community-operators-9k72v\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") " pod="openshift-marketplace/community-operators-9k72v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.682481 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" podStartSLOduration=126.682467024 podStartE2EDuration="2m6.682467024s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:36.679320657 +0000 UTC m=+145.675782006" watchObservedRunningTime="2025-12-03 22:07:36.682467024 +0000 UTC m=+145.678928373" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.719335 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xlvh\" (UniqueName: \"kubernetes.io/projected/7ae91290-b6c1-4aab-a373-dec2848c94db-kube-api-access-5xlvh\") pod \"community-operators-9k72v\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") " pod="openshift-marketplace/community-operators-9k72v" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.731933 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" event={"ID":"9a079205-7ea0-45ab-aecf-7944fd65888c","Type":"ContainerStarted","Data":"a124e40fa477d5436f1b682cf753aa73f64e7ea085be5411bb2166cafcb2c959"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.745303 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:36 crc kubenswrapper[4830]: E1203 22:07:36.749046 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:37.249030303 +0000 UTC m=+146.245491652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.755033 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.763838 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" event={"ID":"f197bf6d-1e49-4af0-9d6c-1df2e4fd0222","Type":"ContainerStarted","Data":"f7bec60d6c6ef5aff21f0878c922a20f3c10271b45f363698740281d888228f4"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.779026 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" event={"ID":"0183561f-6785-4549-b38a-49de4135ef09","Type":"ContainerStarted","Data":"2fab25fb2201c65311f7f8242aac3674e940cabf7d548e12f2e5b37c232ecece"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.793869 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qzbgl" event={"ID":"52c878ad-7067-4961-bbc2-fc794501de21","Type":"ContainerStarted","Data":"1dc35a4c33f4449a6bcd036820f356519de18efcb3781d25b1566b40776f9f94"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.846644 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:36 crc kubenswrapper[4830]: E1203 22:07:36.846792 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:37.346766971 +0000 UTC m=+146.343228320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.846924 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:36 crc kubenswrapper[4830]: E1203 22:07:36.847180 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:37.347167633 +0000 UTC m=+146.343628982 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.883384 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" podStartSLOduration=125.883369694 podStartE2EDuration="2m5.883369694s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:36.718588323 +0000 UTC m=+145.715049672" watchObservedRunningTime="2025-12-03 22:07:36.883369694 +0000 UTC m=+145.879831033" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.889742 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" event={"ID":"5731307a-cc06-410f-bfeb-dcae600b121a","Type":"ContainerStarted","Data":"9422c5fc99ae1a51085f2fb357251c76fb7670d4ec0eab25084b97d61ae1e9e4"} Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.900206 4830 patch_prober.go:28] interesting pod/downloads-7954f5f757-kqxdj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.900270 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-kqxdj" podUID="0feb4ea5-f6ce-42ca-9119-5452ba323af2" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.917884 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.952953 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:36 crc kubenswrapper[4830]: E1203 22:07:36.953992 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:37.453976285 +0000 UTC m=+146.450437634 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.982633 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qzbgl" podStartSLOduration=7.982617314 podStartE2EDuration="7.982617314s" podCreationTimestamp="2025-12-03 22:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:36.971744141 +0000 UTC m=+145.968205490" watchObservedRunningTime="2025-12-03 22:07:36.982617314 +0000 UTC m=+145.979078663" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.993590 4830 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lq5x8 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 03 22:07:36 crc kubenswrapper[4830]: [+]log ok Dec 03 22:07:36 crc kubenswrapper[4830]: [+]etcd ok Dec 03 22:07:36 crc kubenswrapper[4830]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 03 22:07:36 crc kubenswrapper[4830]: [+]poststarthook/generic-apiserver-start-informers ok Dec 03 22:07:36 crc kubenswrapper[4830]: [+]poststarthook/max-in-flight-filter ok Dec 03 22:07:36 crc kubenswrapper[4830]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 03 22:07:36 crc kubenswrapper[4830]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 03 22:07:36 crc kubenswrapper[4830]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles 
failed: reason withheld Dec 03 22:07:36 crc kubenswrapper[4830]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 03 22:07:36 crc kubenswrapper[4830]: [+]poststarthook/project.openshift.io-projectcache ok Dec 03 22:07:36 crc kubenswrapper[4830]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 03 22:07:36 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-startinformers ok Dec 03 22:07:36 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 03 22:07:36 crc kubenswrapper[4830]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 03 22:07:36 crc kubenswrapper[4830]: livez check failed Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.993647 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" podUID="76bcec30-21d9-4a72-9e84-ee3d19ea64c4" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:36 crc kubenswrapper[4830]: I1203 22:07:36.983365 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9k72v" Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.029810 4830 patch_prober.go:28] interesting pod/router-default-5444994796-gqw4d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:37 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Dec 03 22:07:37 crc kubenswrapper[4830]: [+]process-running ok Dec 03 22:07:37 crc kubenswrapper[4830]: healthz check failed Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.029919 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gqw4d" podUID="79145c71-ecbd-4434-ad66-bb1dc84facff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.047251 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hcm9k" Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.063761 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:37 crc kubenswrapper[4830]: E1203 22:07:37.085215 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:37.585199609 +0000 UTC m=+146.581660958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.148994 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" podStartSLOduration=127.148965268 podStartE2EDuration="2m7.148965268s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:37.148950718 +0000 UTC m=+146.145412067" watchObservedRunningTime="2025-12-03 22:07:37.148965268 +0000 UTC m=+146.145426617" Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.178264 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:37 crc kubenswrapper[4830]: E1203 22:07:37.178636 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:37.678620377 +0000 UTC m=+146.675081726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.279244 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:37 crc kubenswrapper[4830]: E1203 22:07:37.279815 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:37.779803732 +0000 UTC m=+146.776265071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.383930 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:37 crc kubenswrapper[4830]: E1203 22:07:37.384207 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:37.884192916 +0000 UTC m=+146.880654265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.461668 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" podStartSLOduration=126.461651709 podStartE2EDuration="2m6.461651709s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:37.397703593 +0000 UTC m=+146.394164942" watchObservedRunningTime="2025-12-03 22:07:37.461651709 +0000 UTC m=+146.458113058" Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.487402 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:37 crc kubenswrapper[4830]: E1203 22:07:37.487971 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:37.987959913 +0000 UTC m=+146.984421262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.595027 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:37 crc kubenswrapper[4830]: E1203 22:07:37.595395 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.095380073 +0000 UTC m=+147.091841422 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.674430 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mcjzb"] Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.704384 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:37 crc kubenswrapper[4830]: E1203 22:07:37.704756 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.204744106 +0000 UTC m=+147.201205455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.739339 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gxd8v"] Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.774238 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9k72v"] Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.805706 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:37 crc kubenswrapper[4830]: E1203 22:07:37.806088 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.306073105 +0000 UTC m=+147.302534454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:37 crc kubenswrapper[4830]: W1203 22:07:37.822758 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ae91290_b6c1_4aab_a373_dec2848c94db.slice/crio-a31e9a60f12770909772f0d5ab8aa9e83d8e867630c7a1aaca3cdcbd3a0a6e39 WatchSource:0}: Error finding container a31e9a60f12770909772f0d5ab8aa9e83d8e867630c7a1aaca3cdcbd3a0a6e39: Status 404 returned error can't find the container with id a31e9a60f12770909772f0d5ab8aa9e83d8e867630c7a1aaca3cdcbd3a0a6e39 Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.906999 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:37 crc kubenswrapper[4830]: E1203 22:07:37.907328 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.407316411 +0000 UTC m=+147.403777760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.907862 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-549dj" event={"ID":"7a5f4368-709e-47c5-8a7c-669dc97e78c5","Type":"ContainerStarted","Data":"921cacd39c0e80a4628b6a8bc9f11a547d9e19432ded21b39c7faf5436cac1d2"} Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.930400 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" event={"ID":"3e45069c-a721-4a08-8f23-37ffd520d843","Type":"ContainerStarted","Data":"32785fc3364d8e18880f070d85de53d09ba5616b3a8126e0931b0cc14c69dc1c"} Dec 03 22:07:37 crc kubenswrapper[4830]: I1203 22:07:37.971571 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j7zzm"] Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:37.999567 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.001118 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcjzb" event={"ID":"39a2a67d-4e6e-4514-9304-966057dd71bf","Type":"ContainerStarted","Data":"5b123390d61460d16a0dc63a3b037169a64f71efe080f887a7487b7cc53690ff"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.003625 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j7zzm"] Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.008462 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.013487 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.019397 4830 patch_prober.go:28] interesting pod/router-default-5444994796-gqw4d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:38 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Dec 03 22:07:38 crc kubenswrapper[4830]: [+]process-running ok Dec 03 22:07:38 crc kubenswrapper[4830]: healthz check failed Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.019464 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gqw4d" podUID="79145c71-ecbd-4434-ad66-bb1dc84facff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 
22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.020313 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-rm2x4" podStartSLOduration=127.020293595 podStartE2EDuration="2m7.020293595s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:37.997804967 +0000 UTC m=+146.994266316" watchObservedRunningTime="2025-12-03 22:07:38.020293595 +0000 UTC m=+147.016754944" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.023588 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" event={"ID":"3102808a-d149-4c79-bda3-29ce37d9b96b","Type":"ContainerStarted","Data":"e487489c506620faa80fb760895bc593939a98b7f17a67b39905a709677e219b"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.023632 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" event={"ID":"3102808a-d149-4c79-bda3-29ce37d9b96b","Type":"ContainerStarted","Data":"f227c302a446ee14aab8205eb4bde80520781775e640d5042d4f6a95c21fdec5"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.027178 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" event={"ID":"cfe812a6-0f2d-49a3-ba8b-9af722589906","Type":"ContainerStarted","Data":"fecfdaf875db97d8a8affaaee735038952698c7116acb407a4b2f0652d8ef27b"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.038392 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p8x7k"] Dec 03 22:07:38 crc kubenswrapper[4830]: E1203 22:07:38.042409 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.542383062 +0000 UTC m=+147.538844411 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.055933 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv" event={"ID":"544dd01f-d91c-4195-b649-8d2aa5a54c49","Type":"ContainerStarted","Data":"a2ee8bc4a7ff1516d4a8e85bfce832ea1a64736fba4cb55bd75faa6ec977e37d"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.083175 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" event={"ID":"d4274708-7133-40ca-a10d-e3d2c5fba4cf","Type":"ContainerStarted","Data":"fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.083219 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" event={"ID":"d4274708-7133-40ca-a10d-e3d2c5fba4cf","Type":"ContainerStarted","Data":"39a3b55131bfe8f32af9a43f11b717c42dde81ec31fe34f91871bd17737635d1"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.090706 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 
22:07:38.090871 4830 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jbhf7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.090926 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" podUID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.098057 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" podStartSLOduration=128.098035286 podStartE2EDuration="2m8.098035286s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.091293048 +0000 UTC m=+147.087754397" watchObservedRunningTime="2025-12-03 22:07:38.098035286 +0000 UTC m=+147.094496635" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.116050 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qzbgl" event={"ID":"52c878ad-7067-4961-bbc2-fc794501de21","Type":"ContainerStarted","Data":"41e49fd9cb18275cc674e11550c1482ebb1645534b1763766a4077d0ed4fe889"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.124692 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b65s\" (UniqueName: \"kubernetes.io/projected/876ff782-f899-41ad-801d-52d31854b34c-kube-api-access-4b65s\") pod \"redhat-marketplace-j7zzm\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") 
" pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.124841 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-utilities\") pod \"redhat-marketplace-j7zzm\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") " pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.125079 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.125206 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-catalog-content\") pod \"redhat-marketplace-j7zzm\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") " pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:07:38 crc kubenswrapper[4830]: E1203 22:07:38.138367 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.638342952 +0000 UTC m=+147.634804301 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.167290 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" event={"ID":"d85a4e41-11b6-4a03-9ff5-6143a77915ed","Type":"ContainerStarted","Data":"25ff5d413e31d1e9adcfeb3eae7e24e4a310d0f5044f4bfcc34f0e6a5960c784"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.167938 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.176482 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tz5gv" podStartSLOduration=127.176464136 podStartE2EDuration="2m7.176464136s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.130876783 +0000 UTC m=+147.127338122" watchObservedRunningTime="2025-12-03 22:07:38.176464136 +0000 UTC m=+147.172925485" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.177199 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gtl86" podStartSLOduration=127.177193326 podStartE2EDuration="2m7.177193326s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.167705551 +0000 UTC m=+147.164166900" watchObservedRunningTime="2025-12-03 22:07:38.177193326 +0000 UTC m=+147.173654675" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.190129 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" event={"ID":"e7368d40-f4b4-49ea-9d46-fc1cff0c4438","Type":"ContainerStarted","Data":"d0f66a9c5e2ddefe70593fac68f6ea311478b0844a0761500c395ba062272d50"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.190755 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.213739 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" event={"ID":"ab47206e-15eb-4a18-8ca8-d8cfc7c510ff","Type":"ContainerStarted","Data":"feb7026382b514238eea2c17e70c98b9477ae95a293a988b02c291f1b4dc1693"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.228026 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.228278 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b65s\" (UniqueName: \"kubernetes.io/projected/876ff782-f899-41ad-801d-52d31854b34c-kube-api-access-4b65s\") pod \"redhat-marketplace-j7zzm\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") " pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:07:38 crc kubenswrapper[4830]: 
I1203 22:07:38.228314 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-utilities\") pod \"redhat-marketplace-j7zzm\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") " pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.228422 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-catalog-content\") pod \"redhat-marketplace-j7zzm\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") " pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.229234 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-utilities\") pod \"redhat-marketplace-j7zzm\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") " pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.229463 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96lsr" event={"ID":"240f3db0-1282-4776-88ea-4a3add0bff2b","Type":"ContainerStarted","Data":"38ba977b4aa7f718e8ed5de8b30905c9ce0de2fbfe8f9a17d1a28be103c79b90"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.229497 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96lsr" event={"ID":"240f3db0-1282-4776-88ea-4a3add0bff2b","Type":"ContainerStarted","Data":"93fa33d5aa3fff760fee5f1078397ac79cff0ca92515bf8ce3170d653d913048"} Dec 03 22:07:38 crc kubenswrapper[4830]: E1203 22:07:38.230322 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.730304489 +0000 UTC m=+147.726765838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.230571 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-catalog-content\") pod \"redhat-marketplace-j7zzm\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") " pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.235009 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k72v" event={"ID":"7ae91290-b6c1-4aab-a373-dec2848c94db","Type":"ContainerStarted","Data":"a31e9a60f12770909772f0d5ab8aa9e83d8e867630c7a1aaca3cdcbd3a0a6e39"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.248352 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" podStartSLOduration=127.248325542 podStartE2EDuration="2m7.248325542s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.21204727 +0000 UTC m=+147.208508619" watchObservedRunningTime="2025-12-03 22:07:38.248325542 +0000 UTC m=+147.244786891" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.259389 4830 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" event={"ID":"20fcc857-dc4e-43a1-83c7-5191075fe805","Type":"ContainerStarted","Data":"3cc60cf7ab1d7bf67e68fe6f5da529d8175400976d9b91ba1947724379f144a4"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.259429 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" event={"ID":"20fcc857-dc4e-43a1-83c7-5191075fe805","Type":"ContainerStarted","Data":"a8c53384726c5a82b6c3732e17d39b81bdd0261c97fa0293a2c9103a41a22688"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.286087 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ch42b" podStartSLOduration=128.286071076 podStartE2EDuration="2m8.286071076s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.285735207 +0000 UTC m=+147.282196556" watchObservedRunningTime="2025-12-03 22:07:38.286071076 +0000 UTC m=+147.282532425" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.287538 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" podStartSLOduration=127.287531647 podStartE2EDuration="2m7.287531647s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.248045355 +0000 UTC m=+147.244506714" watchObservedRunningTime="2025-12-03 22:07:38.287531647 +0000 UTC m=+147.283992996" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.291692 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4b65s\" (UniqueName: \"kubernetes.io/projected/876ff782-f899-41ad-801d-52d31854b34c-kube-api-access-4b65s\") pod \"redhat-marketplace-j7zzm\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") " pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.299265 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" event={"ID":"76bcec30-21d9-4a72-9e84-ee3d19ea64c4","Type":"ContainerStarted","Data":"80d6ea06c6289beb894c76151153ce257b653dbe9e8ca20ac066a975cb82b2dc"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.314171 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" podStartSLOduration=128.31415428 podStartE2EDuration="2m8.31415428s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.310765845 +0000 UTC m=+147.307227194" watchObservedRunningTime="2025-12-03 22:07:38.31415428 +0000 UTC m=+147.310615629" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.319411 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" event={"ID":"5604c165-492d-4736-848c-254474834852","Type":"ContainerStarted","Data":"5447db415eca0c8d6332b94f6ddf193daf1c39ca7f2f9f28edb68393596fe32f"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.329259 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:38 crc kubenswrapper[4830]: 
I1203 22:07:38.329319 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.329356 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.329418 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.329529 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.332712 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:38 crc kubenswrapper[4830]: E1203 22:07:38.333143 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.83312683 +0000 UTC m=+147.829588279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.344248 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" event={"ID":"b7334cd7-95ed-4eef-a166-ea13a8a59382","Type":"ContainerStarted","Data":"e71eb36ce284a0bf81620fbed92fbd7df9942fb86597dd0d3cdf976f7966c93a"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.346257 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-76msc" podStartSLOduration=127.346240256 podStartE2EDuration="2m7.346240256s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.343950982 +0000 UTC m=+147.340412321" watchObservedRunningTime="2025-12-03 22:07:38.346240256 +0000 UTC m=+147.342701605" Dec 03 22:07:38 crc 
kubenswrapper[4830]: I1203 22:07:38.351693 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.352095 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.356753 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.357689 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" event={"ID":"f197bf6d-1e49-4af0-9d6c-1df2e4fd0222","Type":"ContainerStarted","Data":"a166e371e3643c4d449d9a752ecf6142ddadc895a97e4bf2083d7060f81e9e7a"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.373127 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" event={"ID":"26c4588c-23f9-4e78-88cf-ced97b89403e","Type":"ContainerStarted","Data":"0c3b3bc061e1b4642df62dce72647d1614d77c6cae32d53b19c4027878af1126"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 
22:07:38.373170 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" event={"ID":"26c4588c-23f9-4e78-88cf-ced97b89403e","Type":"ContainerStarted","Data":"9fb1e238f0b7f9d9d8ee75aade3f93b43de808a02d5f6c8a2c1e1b1ae3e8f65d"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.384363 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vhtwk" event={"ID":"5731307a-cc06-410f-bfeb-dcae600b121a","Type":"ContainerStarted","Data":"6794b153f131832cb5fc18ef907e80d02cf378d5b6eb84c2deb16eb0d2aab122"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.389132 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" podStartSLOduration=127.389115353 podStartE2EDuration="2m7.389115353s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.369391352 +0000 UTC m=+147.365852701" watchObservedRunningTime="2025-12-03 22:07:38.389115353 +0000 UTC m=+147.385576702" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.395307 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-nmglf" event={"ID":"0183561f-6785-4549-b38a-49de4135ef09","Type":"ContainerStarted","Data":"d84f7d73400c52bd5994db6e2bf32b35b821baed921ce0a66be085113eac7ba7"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.398844 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" event={"ID":"9a079205-7ea0-45ab-aecf-7944fd65888c","Type":"ContainerStarted","Data":"5da986996aba69ab7f26231eb2ca763aefcd667aae459c653618e467cce8170d"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.399269 4830 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-9gkfj"] Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.413573 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" event={"ID":"abaafafa-8c2c-497e-9c74-f88fc3fddee7","Type":"ContainerStarted","Data":"deef368e606af2453d12b4016899ffd59126dd845ed87a8b615b66e3b381d1f3"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.415538 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bl8vb" podStartSLOduration=127.41551346 podStartE2EDuration="2m7.41551346s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.413296028 +0000 UTC m=+147.409757377" watchObservedRunningTime="2025-12-03 22:07:38.41551346 +0000 UTC m=+147.411974809" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.415884 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gkfj" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.427988 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" event={"ID":"a37cd1cb-47b4-47da-9457-6ece58cecdb8","Type":"ContainerStarted","Data":"4a7f91ce06e01a8c2174ff12be6f9c4e732f22369b6ab2a73305c5d534bf70c0"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.431075 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gkfj"] Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.431631 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:38 crc kubenswrapper[4830]: E1203 22:07:38.434899 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:38.93488121 +0000 UTC m=+147.931342559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.442777 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hds2c" event={"ID":"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd","Type":"ContainerStarted","Data":"a3ef3b1442c1b5b37ca1832fea39d553d5771b5974a7b6c2fa280f3f8f4f92c8"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.442814 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hds2c" event={"ID":"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd","Type":"ContainerStarted","Data":"629c0d2a5feefceb414248ecd81b5297622834e0230443782901a6264b7c9ff9"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.443485 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.454790 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxd8v" event={"ID":"d80631b4-3fa5-491b-b330-80f733c3b0a4","Type":"ContainerStarted","Data":"02aa2320e9aa4982d01f85ab026ee788946e7eb5e2b4ff7ceac0164fc9f54599"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.463704 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.472095 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.472491 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6b7zs" podStartSLOduration=127.472480321 podStartE2EDuration="2m7.472480321s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.471806332 +0000 UTC m=+147.468267671" watchObservedRunningTime="2025-12-03 22:07:38.472480321 +0000 UTC m=+147.468941670" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.473576 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-st7xh" podStartSLOduration=127.473569711 podStartE2EDuration="2m7.473569711s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.443065619 +0000 UTC m=+147.439526968" watchObservedRunningTime="2025-12-03 22:07:38.473569711 +0000 UTC m=+147.470031060" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.485975 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" event={"ID":"6bef6234-48ae-4119-8bfb-13df57cf038d","Type":"ContainerStarted","Data":"eb4841399b16a33d7044bf22c9986d2bb243ebb64cb921bc8bd0b1d16063089e"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.486009 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" 
event={"ID":"6bef6234-48ae-4119-8bfb-13df57cf038d","Type":"ContainerStarted","Data":"e5d06697d297a447f729f2f1ea8bfb9baf46f63c4abc591ae13c7c8602866eaa"} Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.534311 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-utilities\") pod \"redhat-marketplace-9gkfj\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") " pod="openshift-marketplace/redhat-marketplace-9gkfj" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.534849 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-catalog-content\") pod \"redhat-marketplace-9gkfj\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") " pod="openshift-marketplace/redhat-marketplace-9gkfj" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.535000 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5brgk\" (UniqueName: \"kubernetes.io/projected/f30f50ee-2d3f-4d4d-9846-c9b916c42375-kube-api-access-5brgk\") pod \"redhat-marketplace-9gkfj\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") " pod="openshift-marketplace/redhat-marketplace-9gkfj" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.535050 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:38 crc kubenswrapper[4830]: E1203 22:07:38.540383 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:39.040364176 +0000 UTC m=+148.036825605 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.543491 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hds2c" podStartSLOduration=9.543470492 podStartE2EDuration="9.543470492s" podCreationTimestamp="2025-12-03 22:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.515961144 +0000 UTC m=+147.512422493" watchObservedRunningTime="2025-12-03 22:07:38.543470492 +0000 UTC m=+147.539931841" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.579158 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.597139 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.611282 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.617804 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9wvcl" podStartSLOduration=127.617786208 podStartE2EDuration="2m7.617786208s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.612970833 +0000 UTC m=+147.609432192" watchObservedRunningTime="2025-12-03 22:07:38.617786208 +0000 UTC m=+147.614247557" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.638181 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.638433 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5brgk\" (UniqueName: \"kubernetes.io/projected/f30f50ee-2d3f-4d4d-9846-c9b916c42375-kube-api-access-5brgk\") pod \"redhat-marketplace-9gkfj\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") " pod="openshift-marketplace/redhat-marketplace-9gkfj" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.638494 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-utilities\") pod \"redhat-marketplace-9gkfj\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") " pod="openshift-marketplace/redhat-marketplace-9gkfj" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.638548 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-catalog-content\") pod \"redhat-marketplace-9gkfj\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") " pod="openshift-marketplace/redhat-marketplace-9gkfj" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.638918 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-catalog-content\") pod \"redhat-marketplace-9gkfj\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") " pod="openshift-marketplace/redhat-marketplace-9gkfj" Dec 03 22:07:38 crc kubenswrapper[4830]: E1203 22:07:38.638996 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:39.138980669 +0000 UTC m=+148.135442008 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.639317 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-utilities\") pod \"redhat-marketplace-9gkfj\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") " pod="openshift-marketplace/redhat-marketplace-9gkfj" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.681570 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5brgk\" (UniqueName: \"kubernetes.io/projected/f30f50ee-2d3f-4d4d-9846-c9b916c42375-kube-api-access-5brgk\") pod \"redhat-marketplace-9gkfj\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") " pod="openshift-marketplace/redhat-marketplace-9gkfj" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.687385 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mjgm6" podStartSLOduration=127.687369611 podStartE2EDuration="2m7.687369611s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.682851054 +0000 UTC m=+147.679312393" watchObservedRunningTime="2025-12-03 22:07:38.687369611 +0000 UTC m=+147.683830960" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.724402 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2vcn" podStartSLOduration=127.724376224 podStartE2EDuration="2m7.724376224s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.722712487 +0000 UTC m=+147.719173836" watchObservedRunningTime="2025-12-03 22:07:38.724376224 +0000 UTC m=+147.720837573" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.746190 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:38 crc kubenswrapper[4830]: E1203 22:07:38.746704 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:39.246669466 +0000 UTC m=+148.243130815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.766214 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gkfj" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.848103 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:38 crc kubenswrapper[4830]: E1203 22:07:38.848249 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:39.348227241 +0000 UTC m=+148.344688590 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.848282 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:38 crc kubenswrapper[4830]: E1203 22:07:38.851168 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-03 22:07:39.351158373 +0000 UTC m=+148.347619722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.954159 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:38 crc kubenswrapper[4830]: E1203 22:07:38.954459 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:39.454443007 +0000 UTC m=+148.450904346 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.998369 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tkdln" podStartSLOduration=127.998351923 podStartE2EDuration="2m7.998351923s" podCreationTimestamp="2025-12-03 22:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:38.850764292 +0000 UTC m=+147.847225631" watchObservedRunningTime="2025-12-03 22:07:38.998351923 +0000 UTC m=+147.994813272" Dec 03 22:07:38 crc kubenswrapper[4830]: I1203 22:07:38.999384 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nvnwt"] Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.000373 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.003938 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.019273 4830 patch_prober.go:28] interesting pod/router-default-5444994796-gqw4d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:39 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Dec 03 22:07:39 crc kubenswrapper[4830]: [+]process-running ok Dec 03 22:07:39 crc kubenswrapper[4830]: healthz check failed Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.019315 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gqw4d" podUID="79145c71-ecbd-4434-ad66-bb1dc84facff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.043269 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hkwdv" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.055020 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-utilities\") pod \"redhat-operators-nvnwt\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.055078 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.055100 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-catalog-content\") pod \"redhat-operators-nvnwt\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.055136 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrlmz\" (UniqueName: \"kubernetes.io/projected/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-kube-api-access-rrlmz\") pod \"redhat-operators-nvnwt\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.055416 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:39.555403106 +0000 UTC m=+148.551864455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.076061 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nvnwt"] Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.084644 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j7zzm"] Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.157818 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.158248 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-catalog-content\") pod \"redhat-operators-nvnwt\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.158290 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrlmz\" (UniqueName: \"kubernetes.io/projected/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-kube-api-access-rrlmz\") pod \"redhat-operators-nvnwt\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 
22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.158334 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-utilities\") pod \"redhat-operators-nvnwt\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.158724 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-utilities\") pod \"redhat-operators-nvnwt\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.158784 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:39.658769452 +0000 UTC m=+148.655230791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.158997 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-catalog-content\") pod \"redhat-operators-nvnwt\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.190447 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrlmz\" (UniqueName: \"kubernetes.io/projected/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-kube-api-access-rrlmz\") pod \"redhat-operators-nvnwt\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.260152 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.260414 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 22:07:39.760404139 +0000 UTC m=+148.756865488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.324905 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.357348 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tbst4"] Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.358366 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbst4" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.362213 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.367090 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:39.867064417 +0000 UTC m=+148.863525766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.372008 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbst4"] Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.468062 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.468131 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-catalog-content\") pod \"redhat-operators-tbst4\" (UID: \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") " pod="openshift-marketplace/redhat-operators-tbst4" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.468170 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-utilities\") pod \"redhat-operators-tbst4\" (UID: \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") " pod="openshift-marketplace/redhat-operators-tbst4" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.468228 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsv6v\" (UniqueName: \"kubernetes.io/projected/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-kube-api-access-xsv6v\") pod \"redhat-operators-tbst4\" (UID: \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") " pod="openshift-marketplace/redhat-operators-tbst4" Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.468573 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:39.96856086 +0000 UTC m=+148.965022209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.487919 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gkfj"] Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.495288 4830 generic.go:334] "Generic (PLEG): container finished" podID="d62dec8b-fae8-4022-bfb0-485be07c4700" containerID="e753d5689faaa14eb1e96d006f9d07d461431388a9463468531d9678a4f40a56" exitCode=0 Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.495359 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8x7k" event={"ID":"d62dec8b-fae8-4022-bfb0-485be07c4700","Type":"ContainerDied","Data":"e753d5689faaa14eb1e96d006f9d07d461431388a9463468531d9678a4f40a56"} Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.495401 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-p8x7k" event={"ID":"d62dec8b-fae8-4022-bfb0-485be07c4700","Type":"ContainerStarted","Data":"34f6cbad054928f62c1cb5dabceaee5faef4c3076a44347c3324f8123e66e991"} Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.499955 4830 generic.go:334] "Generic (PLEG): container finished" podID="d80631b4-3fa5-491b-b330-80f733c3b0a4" containerID="ccde60b69a45f764380653a6601127cbfe68e82601f99300060b036de11b4d2a" exitCode=0 Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.500012 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxd8v" event={"ID":"d80631b4-3fa5-491b-b330-80f733c3b0a4","Type":"ContainerDied","Data":"ccde60b69a45f764380653a6601127cbfe68e82601f99300060b036de11b4d2a"} Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.503020 4830 generic.go:334] "Generic (PLEG): container finished" podID="39a2a67d-4e6e-4514-9304-966057dd71bf" containerID="3483b185fde367ef6debf2e897dc08caf379f51e7d72f9ad631a878735847ca4" exitCode=0 Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.503119 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcjzb" event={"ID":"39a2a67d-4e6e-4514-9304-966057dd71bf","Type":"ContainerDied","Data":"3483b185fde367ef6debf2e897dc08caf379f51e7d72f9ad631a878735847ca4"} Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.511324 4830 generic.go:334] "Generic (PLEG): container finished" podID="7ae91290-b6c1-4aab-a373-dec2848c94db" containerID="91bfdac2608ec8c9fbbf9a57dbe3f434760e2a0270d4e90bbc9214ce5268fdae" exitCode=0 Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.511382 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k72v" event={"ID":"7ae91290-b6c1-4aab-a373-dec2848c94db","Type":"ContainerDied","Data":"91bfdac2608ec8c9fbbf9a57dbe3f434760e2a0270d4e90bbc9214ce5268fdae"} Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 
22:07:39.521968 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3dc563ca11df443f354d685d03f6bc8f6c5ee6041075960aae5aa5bfc87ab92d"} Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.545136 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"16a3a0cebd3fe21988d6aad25810e73b67ab6984d025a48b8ee0b010f3e1f83c"} Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.560261 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7zzm" event={"ID":"876ff782-f899-41ad-801d-52d31854b34c","Type":"ContainerStarted","Data":"d9b94388d7cee9611a52d035b1f1f4202f32653dc528d84baa4635b866e027fb"} Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.570471 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.572286 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.072242705 +0000 UTC m=+149.068704054 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.581784 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsv6v\" (UniqueName: \"kubernetes.io/projected/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-kube-api-access-xsv6v\") pod \"redhat-operators-tbst4\" (UID: \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") " pod="openshift-marketplace/redhat-operators-tbst4" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.581875 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.581935 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-catalog-content\") pod \"redhat-operators-tbst4\" (UID: \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") " pod="openshift-marketplace/redhat-operators-tbst4" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.581972 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-utilities\") pod \"redhat-operators-tbst4\" (UID: 
\"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") " pod="openshift-marketplace/redhat-operators-tbst4" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.583585 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-utilities\") pod \"redhat-operators-tbst4\" (UID: \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") " pod="openshift-marketplace/redhat-operators-tbst4" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.575193 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hds2c" event={"ID":"cb5bd6ff-d7d9-406b-9fce-dd769a87dcdd","Type":"ContainerStarted","Data":"49294ea5a87067eef729befd8a4a935ad2dd1e0648f0c3af526cd9a6c20885d4"} Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.577154 4830 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jbhf7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.583847 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" podUID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.584816 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-catalog-content\") pod \"redhat-operators-tbst4\" (UID: \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") " pod="openshift-marketplace/redhat-operators-tbst4" Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.592379 4830 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.092357497 +0000 UTC m=+149.088818846 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.604760 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsv6v\" (UniqueName: \"kubernetes.io/projected/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-kube-api-access-xsv6v\") pod \"redhat-operators-tbst4\" (UID: \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") " pod="openshift-marketplace/redhat-operators-tbst4" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.640705 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nvnwt"] Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.684220 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.685869 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 22:07:40.185846287 +0000 UTC m=+149.182307636 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.704964 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbst4" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.786690 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.787395 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.287381622 +0000 UTC m=+149.283842971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.854157 4830 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.891888 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.892056 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.392029904 +0000 UTC m=+149.388491253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.892188 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.892547 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.392537318 +0000 UTC m=+149.388998667 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:39 crc kubenswrapper[4830]: I1203 22:07:39.993916 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:39 crc kubenswrapper[4830]: E1203 22:07:39.995228 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.495212214 +0000 UTC m=+149.491673563 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.015923 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbst4"] Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.018782 4830 patch_prober.go:28] interesting pod/router-default-5444994796-gqw4d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:40 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Dec 03 22:07:40 crc kubenswrapper[4830]: [+]process-running ok Dec 03 22:07:40 crc kubenswrapper[4830]: healthz check failed Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.018840 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gqw4d" podUID="79145c71-ecbd-4434-ad66-bb1dc84facff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.100169 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:40 crc kubenswrapper[4830]: E1203 22:07:40.100475 4830 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.600463323 +0000 UTC m=+149.596924672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.201209 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:40 crc kubenswrapper[4830]: E1203 22:07:40.201388 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.70135993 +0000 UTC m=+149.697821279 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.201444 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:40 crc kubenswrapper[4830]: E1203 22:07:40.201864 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.701847773 +0000 UTC m=+149.698309122 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.302595 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:40 crc kubenswrapper[4830]: E1203 22:07:40.302759 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.80273405 +0000 UTC m=+149.799195399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.303100 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:40 crc kubenswrapper[4830]: E1203 22:07:40.303399 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.803386438 +0000 UTC m=+149.799847787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.409016 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:40 crc kubenswrapper[4830]: E1203 22:07:40.409066 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.909047918 +0000 UTC m=+149.905509267 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.409438 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:40 crc kubenswrapper[4830]: E1203 22:07:40.409817 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 22:07:40.909806219 +0000 UTC m=+149.906267558 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qf7rn" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.504931 4830 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T22:07:39.854180557Z","Handler":null,"Name":""} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.510007 4830 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.510039 4830 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.510370 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.514688 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.609171 4830 generic.go:334] "Generic (PLEG): container finished" podID="876ff782-f899-41ad-801d-52d31854b34c" containerID="b49daa807c4cdb5448ee81d8ad5c0c40d1c24b3f36884f4522b9559d6ec3a242" exitCode=0 Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.609286 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7zzm" event={"ID":"876ff782-f899-41ad-801d-52d31854b34c","Type":"ContainerDied","Data":"b49daa807c4cdb5448ee81d8ad5c0c40d1c24b3f36884f4522b9559d6ec3a242"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.614278 4830 generic.go:334] "Generic (PLEG): container finished" podID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" containerID="396ff73a9172c6d5b3c0f01b72b3cdca74d52590be92d7f5c37e3a24fcc1d9e6" exitCode=0 Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.614336 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gkfj" event={"ID":"f30f50ee-2d3f-4d4d-9846-c9b916c42375","Type":"ContainerDied","Data":"396ff73a9172c6d5b3c0f01b72b3cdca74d52590be92d7f5c37e3a24fcc1d9e6"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.614361 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gkfj" event={"ID":"f30f50ee-2d3f-4d4d-9846-c9b916c42375","Type":"ContainerStarted","Data":"af98a231067e10734fb0ad659358a9f12c1f4b2669ce405c8ba541d65c3dd5e3"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.617049 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: 
\"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.641669 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.641711 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.647540 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96lsr" event={"ID":"240f3db0-1282-4776-88ea-4a3add0bff2b","Type":"ContainerStarted","Data":"0daf80e513df16d8d401b67db2a2d85e427f0bcd60636f85dd8963ed0e46d8f3"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.647585 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96lsr" event={"ID":"240f3db0-1282-4776-88ea-4a3add0bff2b","Type":"ContainerStarted","Data":"bccee9986940c953e3bb181769d79d2b80e2937b9c023e32c72d9e044d020a9b"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.654459 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"87df5671bf99437b5138fcc9235372a3417433316c549e0cb3a9f3716c013b20"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.654527 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"debaf138791c673a56f5b81cb7c51969a9a347ac60df72e5dc56bfee0fb0ebfc"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.660803 4830 generic.go:334] "Generic (PLEG): container finished" podID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerID="36919664608f59d2f26565337502d5c1050f3f9db8dac4ecf5ea52dc9d3cd7f1" exitCode=0 Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.660865 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvnwt" event={"ID":"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24","Type":"ContainerDied","Data":"36919664608f59d2f26565337502d5c1050f3f9db8dac4ecf5ea52dc9d3cd7f1"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.660887 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvnwt" event={"ID":"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24","Type":"ContainerStarted","Data":"e2a3c6cf2ce9e7e6944c0644d1bb0a9fdaedebbf93e43b4519482fd78e31b2a5"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.667379 4830 generic.go:334] "Generic (PLEG): container finished" podID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" containerID="fd5f2b3c4611ac9d79b92cc44d339a9848406d46362dabd718d7f8ec753cf731" exitCode=0 Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.667460 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbst4" event={"ID":"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a","Type":"ContainerDied","Data":"fd5f2b3c4611ac9d79b92cc44d339a9848406d46362dabd718d7f8ec753cf731"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.667483 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbst4" 
event={"ID":"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a","Type":"ContainerStarted","Data":"b9ab4e2e0acac7f02bf97410039c6503787c8266ddcf4ddc885912699c33c0f8"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.671380 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"440f8f88fb99d590233ec3814c668cadb4b27df80d2f510b4f632507b1e49bb8"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.699343 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5745c5063a42adf23a4382d9523175d31fc2884c87a11d6f597b3a31bb3e5874"} Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.699387 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.703878 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:07:40 crc kubenswrapper[4830]: I1203 22:07:40.768745 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qf7rn\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.019423 4830 patch_prober.go:28] interesting pod/router-default-5444994796-gqw4d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:41 
crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Dec 03 22:07:41 crc kubenswrapper[4830]: [+]process-running ok Dec 03 22:07:41 crc kubenswrapper[4830]: healthz check failed Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.019747 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gqw4d" podUID="79145c71-ecbd-4434-ad66-bb1dc84facff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.025556 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.349448 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.632229 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.638401 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lq5x8" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.665976 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qf7rn"] Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.848948 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" event={"ID":"2fbc6d25-674b-4886-9c9e-40971da8de89","Type":"ContainerStarted","Data":"65dc0ecb298639f91d4edb05b54bfcef89d0936609ad389318aeb987daa205be"} Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.869251 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-96lsr" event={"ID":"240f3db0-1282-4776-88ea-4a3add0bff2b","Type":"ContainerStarted","Data":"2775fcbb851e2a81413e8838beec4e8869a62a694fce8954c669559ed631e86e"} Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.904508 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.904594 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.904831 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-96lsr" podStartSLOduration=12.90481464 podStartE2EDuration="12.90481464s" podCreationTimestamp="2025-12-03 22:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:41.903969946 +0000 UTC m=+150.900431295" watchObservedRunningTime="2025-12-03 22:07:41.90481464 +0000 UTC m=+150.901275989" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.934634 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.959897 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.961187 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.966029 4830 patch_prober.go:28] interesting pod/console-f9d7485db-drfg4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 
10.217.0.15:8443: connect: connection refused" start-of-body= Dec 03 22:07:41 crc kubenswrapper[4830]: I1203 22:07:41.966121 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-drfg4" podUID="9f6628e7-55ff-4c71-b3e7-102cb3b6954d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Dec 03 22:07:42 crc kubenswrapper[4830]: I1203 22:07:42.017533 4830 patch_prober.go:28] interesting pod/router-default-5444994796-gqw4d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:42 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Dec 03 22:07:42 crc kubenswrapper[4830]: [+]process-running ok Dec 03 22:07:42 crc kubenswrapper[4830]: healthz check failed Dec 03 22:07:42 crc kubenswrapper[4830]: I1203 22:07:42.017602 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gqw4d" podUID="79145c71-ecbd-4434-ad66-bb1dc84facff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:42 crc kubenswrapper[4830]: I1203 22:07:42.113440 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-kqxdj" Dec 03 22:07:42 crc kubenswrapper[4830]: I1203 22:07:42.901941 4830 generic.go:334] "Generic (PLEG): container finished" podID="3102808a-d149-4c79-bda3-29ce37d9b96b" containerID="e487489c506620faa80fb760895bc593939a98b7f17a67b39905a709677e219b" exitCode=0 Dec 03 22:07:42 crc kubenswrapper[4830]: I1203 22:07:42.902043 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" 
event={"ID":"3102808a-d149-4c79-bda3-29ce37d9b96b","Type":"ContainerDied","Data":"e487489c506620faa80fb760895bc593939a98b7f17a67b39905a709677e219b"} Dec 03 22:07:42 crc kubenswrapper[4830]: I1203 22:07:42.905607 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" event={"ID":"2fbc6d25-674b-4886-9c9e-40971da8de89","Type":"ContainerStarted","Data":"52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c"} Dec 03 22:07:42 crc kubenswrapper[4830]: I1203 22:07:42.928461 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h2wnt" Dec 03 22:07:42 crc kubenswrapper[4830]: I1203 22:07:42.963903 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" podStartSLOduration=132.963884218 podStartE2EDuration="2m12.963884218s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:07:42.937607105 +0000 UTC m=+151.934068464" watchObservedRunningTime="2025-12-03 22:07:42.963884218 +0000 UTC m=+151.960345567" Dec 03 22:07:43 crc kubenswrapper[4830]: I1203 22:07:43.012816 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:43 crc kubenswrapper[4830]: I1203 22:07:43.015647 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:43 crc kubenswrapper[4830]: I1203 22:07:43.919775 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:07:43 crc kubenswrapper[4830]: I1203 22:07:43.932874 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/router-default-5444994796-gqw4d" Dec 03 22:07:43 crc kubenswrapper[4830]: I1203 22:07:43.944844 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 22:07:43 crc kubenswrapper[4830]: I1203 22:07:43.945684 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 22:07:43 crc kubenswrapper[4830]: I1203 22:07:43.948057 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 22:07:43 crc kubenswrapper[4830]: I1203 22:07:43.957029 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 22:07:43 crc kubenswrapper[4830]: I1203 22:07:43.974934 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.002135 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e406d9e6-5a62-411f-b7e7-bca58ce548af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e406d9e6-5a62-411f-b7e7-bca58ce548af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.002362 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e406d9e6-5a62-411f-b7e7-bca58ce548af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e406d9e6-5a62-411f-b7e7-bca58ce548af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.103876 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/e406d9e6-5a62-411f-b7e7-bca58ce548af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e406d9e6-5a62-411f-b7e7-bca58ce548af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.104009 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e406d9e6-5a62-411f-b7e7-bca58ce548af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e406d9e6-5a62-411f-b7e7-bca58ce548af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.105221 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e406d9e6-5a62-411f-b7e7-bca58ce548af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e406d9e6-5a62-411f-b7e7-bca58ce548af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.145320 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e406d9e6-5a62-411f-b7e7-bca58ce548af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e406d9e6-5a62-411f-b7e7-bca58ce548af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.293310 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.399672 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.413002 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.413284 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3102808a-d149-4c79-bda3-29ce37d9b96b-config-volume\") pod \"3102808a-d149-4c79-bda3-29ce37d9b96b\" (UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.413464 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3102808a-d149-4c79-bda3-29ce37d9b96b-secret-volume\") pod \"3102808a-d149-4c79-bda3-29ce37d9b96b\" (UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.413537 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z78w\" (UniqueName: \"kubernetes.io/projected/3102808a-d149-4c79-bda3-29ce37d9b96b-kube-api-access-2z78w\") pod \"3102808a-d149-4c79-bda3-29ce37d9b96b\" (UID: \"3102808a-d149-4c79-bda3-29ce37d9b96b\") " Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.414159 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3102808a-d149-4c79-bda3-29ce37d9b96b-config-volume" (OuterVolumeSpecName: "config-volume") pod "3102808a-d149-4c79-bda3-29ce37d9b96b" (UID: "3102808a-d149-4c79-bda3-29ce37d9b96b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.417412 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3102808a-d149-4c79-bda3-29ce37d9b96b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3102808a-d149-4c79-bda3-29ce37d9b96b" (UID: "3102808a-d149-4c79-bda3-29ce37d9b96b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.418155 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3102808a-d149-4c79-bda3-29ce37d9b96b-kube-api-access-2z78w" (OuterVolumeSpecName: "kube-api-access-2z78w") pod "3102808a-d149-4c79-bda3-29ce37d9b96b" (UID: "3102808a-d149-4c79-bda3-29ce37d9b96b"). InnerVolumeSpecName "kube-api-access-2z78w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.516418 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3102808a-d149-4c79-bda3-29ce37d9b96b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.516469 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z78w\" (UniqueName: \"kubernetes.io/projected/3102808a-d149-4c79-bda3-29ce37d9b96b-kube-api-access-2z78w\") on node \"crc\" DevicePath \"\"" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.516479 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3102808a-d149-4c79-bda3-29ce37d9b96b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.877978 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 22:07:44 crc kubenswrapper[4830]: W1203 
22:07:44.888706 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode406d9e6_5a62_411f_b7e7_bca58ce548af.slice/crio-d85fa905a0b25551e5dbd3e1ef281813ca10ccee5de63be775f291c4a48ed8fc WatchSource:0}: Error finding container d85fa905a0b25551e5dbd3e1ef281813ca10ccee5de63be775f291c4a48ed8fc: Status 404 returned error can't find the container with id d85fa905a0b25551e5dbd3e1ef281813ca10ccee5de63be775f291c4a48ed8fc Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.943357 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" event={"ID":"3102808a-d149-4c79-bda3-29ce37d9b96b","Type":"ContainerDied","Data":"f227c302a446ee14aab8205eb4bde80520781775e640d5042d4f6a95c21fdec5"} Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.943398 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f227c302a446ee14aab8205eb4bde80520781775e640d5042d4f6a95c21fdec5" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.943528 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr" Dec 03 22:07:44 crc kubenswrapper[4830]: I1203 22:07:44.950066 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e406d9e6-5a62-411f-b7e7-bca58ce548af","Type":"ContainerStarted","Data":"d85fa905a0b25551e5dbd3e1ef281813ca10ccee5de63be775f291c4a48ed8fc"} Dec 03 22:07:45 crc kubenswrapper[4830]: I1203 22:07:45.980107 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e406d9e6-5a62-411f-b7e7-bca58ce548af","Type":"ContainerStarted","Data":"f5d32ef332ae1662f1767b5e35ba2036b10f42abe048dd254ffec17b133ac91e"} Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.491845 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 22:07:46 crc kubenswrapper[4830]: E1203 22:07:46.492046 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3102808a-d149-4c79-bda3-29ce37d9b96b" containerName="collect-profiles" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.492057 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3102808a-d149-4c79-bda3-29ce37d9b96b" containerName="collect-profiles" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.492164 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3102808a-d149-4c79-bda3-29ce37d9b96b" containerName="collect-profiles" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.492519 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.496829 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.496898 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.511030 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.655119 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ece324a-613f-44ac-9c33-2b06873c1d22-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5ece324a-613f-44ac-9c33-2b06873c1d22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.655381 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ece324a-613f-44ac-9c33-2b06873c1d22-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5ece324a-613f-44ac-9c33-2b06873c1d22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.758229 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ece324a-613f-44ac-9c33-2b06873c1d22-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5ece324a-613f-44ac-9c33-2b06873c1d22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.758289 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5ece324a-613f-44ac-9c33-2b06873c1d22-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5ece324a-613f-44ac-9c33-2b06873c1d22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.758690 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ece324a-613f-44ac-9c33-2b06873c1d22-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5ece324a-613f-44ac-9c33-2b06873c1d22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.781120 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ece324a-613f-44ac-9c33-2b06873c1d22-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5ece324a-613f-44ac-9c33-2b06873c1d22\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 22:07:46 crc kubenswrapper[4830]: I1203 22:07:46.825361 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 22:07:47 crc kubenswrapper[4830]: I1203 22:07:47.008255 4830 generic.go:334] "Generic (PLEG): container finished" podID="e406d9e6-5a62-411f-b7e7-bca58ce548af" containerID="f5d32ef332ae1662f1767b5e35ba2036b10f42abe048dd254ffec17b133ac91e" exitCode=0 Dec 03 22:07:47 crc kubenswrapper[4830]: I1203 22:07:47.008316 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e406d9e6-5a62-411f-b7e7-bca58ce548af","Type":"ContainerDied","Data":"f5d32ef332ae1662f1767b5e35ba2036b10f42abe048dd254ffec17b133ac91e"} Dec 03 22:07:48 crc kubenswrapper[4830]: I1203 22:07:48.715057 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hds2c" Dec 03 22:07:52 crc kubenswrapper[4830]: I1203 22:07:52.002410 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:52 crc kubenswrapper[4830]: I1203 22:07:52.009213 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:07:53 crc kubenswrapper[4830]: I1203 22:07:53.270857 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:53 crc kubenswrapper[4830]: I1203 22:07:53.296339 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211b6f37-bd3f-475e-b4d9-e3d94ae07c52-metrics-certs\") pod \"network-metrics-daemon-zlcmr\" (UID: \"211b6f37-bd3f-475e-b4d9-e3d94ae07c52\") " pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:53 
crc kubenswrapper[4830]: I1203 22:07:53.564090 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zlcmr" Dec 03 22:07:55 crc kubenswrapper[4830]: I1203 22:07:55.002723 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 22:07:55 crc kubenswrapper[4830]: I1203 22:07:55.080751 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e406d9e6-5a62-411f-b7e7-bca58ce548af","Type":"ContainerDied","Data":"d85fa905a0b25551e5dbd3e1ef281813ca10ccee5de63be775f291c4a48ed8fc"} Dec 03 22:07:55 crc kubenswrapper[4830]: I1203 22:07:55.080786 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d85fa905a0b25551e5dbd3e1ef281813ca10ccee5de63be775f291c4a48ed8fc" Dec 03 22:07:55 crc kubenswrapper[4830]: I1203 22:07:55.080791 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 22:07:55 crc kubenswrapper[4830]: I1203 22:07:55.093203 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e406d9e6-5a62-411f-b7e7-bca58ce548af-kube-api-access\") pod \"e406d9e6-5a62-411f-b7e7-bca58ce548af\" (UID: \"e406d9e6-5a62-411f-b7e7-bca58ce548af\") " Dec 03 22:07:55 crc kubenswrapper[4830]: I1203 22:07:55.093297 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e406d9e6-5a62-411f-b7e7-bca58ce548af-kubelet-dir\") pod \"e406d9e6-5a62-411f-b7e7-bca58ce548af\" (UID: \"e406d9e6-5a62-411f-b7e7-bca58ce548af\") " Dec 03 22:07:55 crc kubenswrapper[4830]: I1203 22:07:55.093353 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e406d9e6-5a62-411f-b7e7-bca58ce548af-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e406d9e6-5a62-411f-b7e7-bca58ce548af" (UID: "e406d9e6-5a62-411f-b7e7-bca58ce548af"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:07:55 crc kubenswrapper[4830]: I1203 22:07:55.093658 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e406d9e6-5a62-411f-b7e7-bca58ce548af-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 22:07:55 crc kubenswrapper[4830]: I1203 22:07:55.097081 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e406d9e6-5a62-411f-b7e7-bca58ce548af-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e406d9e6-5a62-411f-b7e7-bca58ce548af" (UID: "e406d9e6-5a62-411f-b7e7-bca58ce548af"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:07:55 crc kubenswrapper[4830]: I1203 22:07:55.195026 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e406d9e6-5a62-411f-b7e7-bca58ce548af-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 22:07:56 crc kubenswrapper[4830]: I1203 22:07:56.681803 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:07:56 crc kubenswrapper[4830]: I1203 22:07:56.681933 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:08:01 crc kubenswrapper[4830]: I1203 22:08:01.033889 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:08:13 crc kubenswrapper[4830]: I1203 22:08:13.346846 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rn2tk" Dec 03 22:08:19 crc kubenswrapper[4830]: I1203 22:08:19.671571 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.161545 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 22:08:23 crc kubenswrapper[4830]: E1203 22:08:23.161807 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e406d9e6-5a62-411f-b7e7-bca58ce548af" containerName="pruner" Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.161820 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e406d9e6-5a62-411f-b7e7-bca58ce548af" containerName="pruner" Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.162680 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e406d9e6-5a62-411f-b7e7-bca58ce548af" containerName="pruner" Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.163236 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.181638 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.300243 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a9ff58c2-436e-44e9-b752-6d2fd550ebb4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.300318 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a9ff58c2-436e-44e9-b752-6d2fd550ebb4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.402174 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a9ff58c2-436e-44e9-b752-6d2fd550ebb4\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.402304 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a9ff58c2-436e-44e9-b752-6d2fd550ebb4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.402433 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a9ff58c2-436e-44e9-b752-6d2fd550ebb4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.432028 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a9ff58c2-436e-44e9-b752-6d2fd550ebb4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 22:08:23 crc kubenswrapper[4830]: I1203 22:08:23.495051 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 22:08:24 crc kubenswrapper[4830]: E1203 22:08:24.267909 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 22:08:24 crc kubenswrapper[4830]: E1203 22:08:24.268335 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xsv6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-tbst4_openshift-marketplace(37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 22:08:24 crc kubenswrapper[4830]: E1203 22:08:24.269652 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tbst4" podUID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" Dec 03 22:08:25 crc kubenswrapper[4830]: E1203 22:08:25.872944 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tbst4" podUID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" Dec 03 22:08:25 crc kubenswrapper[4830]: E1203 22:08:25.948828 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 22:08:25 crc kubenswrapper[4830]: E1203 22:08:25.949232 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25g85,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mcjzb_openshift-marketplace(39a2a67d-4e6e-4514-9304-966057dd71bf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 22:08:25 crc kubenswrapper[4830]: E1203 22:08:25.950441 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mcjzb" podUID="39a2a67d-4e6e-4514-9304-966057dd71bf" Dec 03 22:08:26 crc 
kubenswrapper[4830]: I1203 22:08:26.681568 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:08:26 crc kubenswrapper[4830]: I1203 22:08:26.681632 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:08:27 crc kubenswrapper[4830]: E1203 22:08:27.226596 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mcjzb" podUID="39a2a67d-4e6e-4514-9304-966057dd71bf" Dec 03 22:08:27 crc kubenswrapper[4830]: E1203 22:08:27.301881 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 22:08:27 crc kubenswrapper[4830]: E1203 22:08:27.302073 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zd4rw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p8x7k_openshift-marketplace(d62dec8b-fae8-4022-bfb0-485be07c4700): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 22:08:27 crc kubenswrapper[4830]: E1203 22:08:27.303265 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p8x7k" podUID="d62dec8b-fae8-4022-bfb0-485be07c4700" Dec 03 22:08:27 crc 
kubenswrapper[4830]: E1203 22:08:27.316437 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 22:08:27 crc kubenswrapper[4830]: E1203 22:08:27.316603 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5xlvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-9k72v_openshift-marketplace(7ae91290-b6c1-4aab-a373-dec2848c94db): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 22:08:27 crc kubenswrapper[4830]: E1203 22:08:27.317709 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9k72v" podUID="7ae91290-b6c1-4aab-a373-dec2848c94db" Dec 03 22:08:27 crc kubenswrapper[4830]: E1203 22:08:27.339032 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 22:08:27 crc kubenswrapper[4830]: E1203 22:08:27.339178 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xhdqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gxd8v_openshift-marketplace(d80631b4-3fa5-491b-b330-80f733c3b0a4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 22:08:27 crc kubenswrapper[4830]: E1203 22:08:27.340482 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gxd8v" podUID="d80631b4-3fa5-491b-b330-80f733c3b0a4" Dec 03 22:08:28 crc 
kubenswrapper[4830]: I1203 22:08:28.159476 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.160758 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.177234 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p8x7k" podUID="d62dec8b-fae8-4022-bfb0-485be07c4700" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.178640 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.254122 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.254313 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5brgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9gkfj_openshift-marketplace(f30f50ee-2d3f-4d4d-9846-c9b916c42375): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.254483 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.255779 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9gkfj" podUID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.256020 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4b65s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-j7zzm_openshift-marketplace(876ff782-f899-41ad-801d-52d31854b34c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.257169 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-j7zzm" podUID="876ff782-f899-41ad-801d-52d31854b34c" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.261969 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-var-lock\") pod \"installer-9-crc\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.262157 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e85574f9-5751-45ae-a62f-5c8cce45e669-kube-api-access\") pod \"installer-9-crc\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.262223 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.301662 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.301821 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rrlmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-nvnwt_openshift-marketplace(d3ffcb90-9016-4c43-8b6c-9452e9cf6e24): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" logger="UnhandledError" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.303640 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-nvnwt" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.326695 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9gkfj" podUID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.327132 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-nvnwt" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.335893 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gxd8v" podUID="d80631b4-3fa5-491b-b330-80f733c3b0a4" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.336326 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-j7zzm" 
podUID="876ff782-f899-41ad-801d-52d31854b34c" Dec 03 22:08:28 crc kubenswrapper[4830]: E1203 22:08:28.336371 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9k72v" podUID="7ae91290-b6c1-4aab-a373-dec2848c94db" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.368471 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-var-lock\") pod \"installer-9-crc\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.368557 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-var-lock\") pod \"installer-9-crc\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.368622 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e85574f9-5751-45ae-a62f-5c8cce45e669-kube-api-access\") pod \"installer-9-crc\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.368873 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.368932 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.398502 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e85574f9-5751-45ae-a62f-5c8cce45e669-kube-api-access\") pod \"installer-9-crc\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.540315 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.648457 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.648753 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.663371 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zlcmr"] Dec 03 22:08:28 crc kubenswrapper[4830]: W1203 22:08:28.669846 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5ece324a_613f_44ac_9c33_2b06873c1d22.slice/crio-13a34163b4da97b1ec2c131aecaaaa3174e2a7312832aee402bf0773e1809b2c WatchSource:0}: Error finding container 13a34163b4da97b1ec2c131aecaaaa3174e2a7312832aee402bf0773e1809b2c: Status 404 returned error can't find the container with id 13a34163b4da97b1ec2c131aecaaaa3174e2a7312832aee402bf0773e1809b2c Dec 03 22:08:28 crc kubenswrapper[4830]: W1203 22:08:28.680290 4830 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod211b6f37_bd3f_475e_b4d9_e3d94ae07c52.slice/crio-d2704899408619154c1bfbf14983a5df39822a51084f83a35d5bdea05bdcbbb7 WatchSource:0}: Error finding container d2704899408619154c1bfbf14983a5df39822a51084f83a35d5bdea05bdcbbb7: Status 404 returned error can't find the container with id d2704899408619154c1bfbf14983a5df39822a51084f83a35d5bdea05bdcbbb7 Dec 03 22:08:28 crc kubenswrapper[4830]: I1203 22:08:28.940475 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 22:08:29 crc kubenswrapper[4830]: I1203 22:08:29.331984 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5ece324a-613f-44ac-9c33-2b06873c1d22","Type":"ContainerStarted","Data":"583060b319630060003634c9d14af1bc56c70f8ca1c9a7ecbca9d3eeeb50230b"} Dec 03 22:08:29 crc kubenswrapper[4830]: I1203 22:08:29.332311 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5ece324a-613f-44ac-9c33-2b06873c1d22","Type":"ContainerStarted","Data":"13a34163b4da97b1ec2c131aecaaaa3174e2a7312832aee402bf0773e1809b2c"} Dec 03 22:08:29 crc kubenswrapper[4830]: I1203 22:08:29.348731 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" event={"ID":"211b6f37-bd3f-475e-b4d9-e3d94ae07c52","Type":"ContainerStarted","Data":"3b7f506b8a7955ac8b6847da0a4ec845566e7d197130f2ab30503c95d49734ca"} Dec 03 22:08:29 crc kubenswrapper[4830]: I1203 22:08:29.348777 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zlcmr" event={"ID":"211b6f37-bd3f-475e-b4d9-e3d94ae07c52","Type":"ContainerStarted","Data":"570f651c18be5fcfdcd10b4aa59c79c282397d2a90cada11e55daabc3de89ad9"} Dec 03 22:08:29 crc kubenswrapper[4830]: I1203 22:08:29.348793 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-zlcmr" event={"ID":"211b6f37-bd3f-475e-b4d9-e3d94ae07c52","Type":"ContainerStarted","Data":"d2704899408619154c1bfbf14983a5df39822a51084f83a35d5bdea05bdcbbb7"} Dec 03 22:08:29 crc kubenswrapper[4830]: I1203 22:08:29.348806 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e85574f9-5751-45ae-a62f-5c8cce45e669","Type":"ContainerStarted","Data":"25bb64b07e6f934eeb490c8f9b2c7df6f5e740844af90f1e038b686579fbed47"} Dec 03 22:08:29 crc kubenswrapper[4830]: I1203 22:08:29.348818 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a9ff58c2-436e-44e9-b752-6d2fd550ebb4","Type":"ContainerStarted","Data":"1ebbb9ad7f7946de6b09d8d8fb69c1410283ab1d15e298995c44e4903bd88eda"} Dec 03 22:08:29 crc kubenswrapper[4830]: I1203 22:08:29.348832 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a9ff58c2-436e-44e9-b752-6d2fd550ebb4","Type":"ContainerStarted","Data":"b013eefae0868c299ceb7b6c7e66b4d8cd0fb641d70f77462fc666d76506d538"} Dec 03 22:08:29 crc kubenswrapper[4830]: I1203 22:08:29.355494 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=43.355477751 podStartE2EDuration="43.355477751s" podCreationTimestamp="2025-12-03 22:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:08:29.349144548 +0000 UTC m=+198.345605917" watchObservedRunningTime="2025-12-03 22:08:29.355477751 +0000 UTC m=+198.351939120" Dec 03 22:08:29 crc kubenswrapper[4830]: I1203 22:08:29.381292 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=6.381275459 
podStartE2EDuration="6.381275459s" podCreationTimestamp="2025-12-03 22:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:08:29.368370215 +0000 UTC m=+198.364831564" watchObservedRunningTime="2025-12-03 22:08:29.381275459 +0000 UTC m=+198.377736808" Dec 03 22:08:30 crc kubenswrapper[4830]: I1203 22:08:30.347584 4830 generic.go:334] "Generic (PLEG): container finished" podID="a9ff58c2-436e-44e9-b752-6d2fd550ebb4" containerID="1ebbb9ad7f7946de6b09d8d8fb69c1410283ab1d15e298995c44e4903bd88eda" exitCode=0 Dec 03 22:08:30 crc kubenswrapper[4830]: I1203 22:08:30.347957 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a9ff58c2-436e-44e9-b752-6d2fd550ebb4","Type":"ContainerDied","Data":"1ebbb9ad7f7946de6b09d8d8fb69c1410283ab1d15e298995c44e4903bd88eda"} Dec 03 22:08:30 crc kubenswrapper[4830]: I1203 22:08:30.351396 4830 generic.go:334] "Generic (PLEG): container finished" podID="5ece324a-613f-44ac-9c33-2b06873c1d22" containerID="583060b319630060003634c9d14af1bc56c70f8ca1c9a7ecbca9d3eeeb50230b" exitCode=0 Dec 03 22:08:30 crc kubenswrapper[4830]: I1203 22:08:30.351930 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5ece324a-613f-44ac-9c33-2b06873c1d22","Type":"ContainerDied","Data":"583060b319630060003634c9d14af1bc56c70f8ca1c9a7ecbca9d3eeeb50230b"} Dec 03 22:08:30 crc kubenswrapper[4830]: I1203 22:08:30.353340 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e85574f9-5751-45ae-a62f-5c8cce45e669","Type":"ContainerStarted","Data":"24937d8ac003416a7379dda08bee7d94b57af4d90fddd93811e83ac74e07802e"} Dec 03 22:08:30 crc kubenswrapper[4830]: I1203 22:08:30.363600 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-zlcmr" podStartSLOduration=180.363573531 podStartE2EDuration="3m0.363573531s" podCreationTimestamp="2025-12-03 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:08:29.389679559 +0000 UTC m=+198.386140908" watchObservedRunningTime="2025-12-03 22:08:30.363573531 +0000 UTC m=+199.360034890" Dec 03 22:08:30 crc kubenswrapper[4830]: I1203 22:08:30.410045 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.410029444 podStartE2EDuration="2.410029444s" podCreationTimestamp="2025-12-03 22:08:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:08:30.409289123 +0000 UTC m=+199.405750502" watchObservedRunningTime="2025-12-03 22:08:30.410029444 +0000 UTC m=+199.406490793" Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.692254 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.694908 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.725000 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ece324a-613f-44ac-9c33-2b06873c1d22-kubelet-dir\") pod \"5ece324a-613f-44ac-9c33-2b06873c1d22\" (UID: \"5ece324a-613f-44ac-9c33-2b06873c1d22\") " Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.725129 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kubelet-dir\") pod \"a9ff58c2-436e-44e9-b752-6d2fd550ebb4\" (UID: \"a9ff58c2-436e-44e9-b752-6d2fd550ebb4\") " Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.725198 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kube-api-access\") pod \"a9ff58c2-436e-44e9-b752-6d2fd550ebb4\" (UID: \"a9ff58c2-436e-44e9-b752-6d2fd550ebb4\") " Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.725470 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ece324a-613f-44ac-9c33-2b06873c1d22-kube-api-access\") pod \"5ece324a-613f-44ac-9c33-2b06873c1d22\" (UID: \"5ece324a-613f-44ac-9c33-2b06873c1d22\") " Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.726086 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a9ff58c2-436e-44e9-b752-6d2fd550ebb4" (UID: "a9ff58c2-436e-44e9-b752-6d2fd550ebb4"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.726179 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ece324a-613f-44ac-9c33-2b06873c1d22-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5ece324a-613f-44ac-9c33-2b06873c1d22" (UID: "5ece324a-613f-44ac-9c33-2b06873c1d22"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.726203 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.733857 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a9ff58c2-436e-44e9-b752-6d2fd550ebb4" (UID: "a9ff58c2-436e-44e9-b752-6d2fd550ebb4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.737818 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ece324a-613f-44ac-9c33-2b06873c1d22-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5ece324a-613f-44ac-9c33-2b06873c1d22" (UID: "5ece324a-613f-44ac-9c33-2b06873c1d22"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.826706 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ece324a-613f-44ac-9c33-2b06873c1d22-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.826750 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9ff58c2-436e-44e9-b752-6d2fd550ebb4-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 22:08:31 crc kubenswrapper[4830]: I1203 22:08:31.826764 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ece324a-613f-44ac-9c33-2b06873c1d22-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 22:08:32 crc kubenswrapper[4830]: I1203 22:08:32.368688 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5ece324a-613f-44ac-9c33-2b06873c1d22","Type":"ContainerDied","Data":"13a34163b4da97b1ec2c131aecaaaa3174e2a7312832aee402bf0773e1809b2c"} Dec 03 22:08:32 crc kubenswrapper[4830]: I1203 22:08:32.368742 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 22:08:32 crc kubenswrapper[4830]: I1203 22:08:32.368746 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13a34163b4da97b1ec2c131aecaaaa3174e2a7312832aee402bf0773e1809b2c" Dec 03 22:08:32 crc kubenswrapper[4830]: I1203 22:08:32.370420 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a9ff58c2-436e-44e9-b752-6d2fd550ebb4","Type":"ContainerDied","Data":"b013eefae0868c299ceb7b6c7e66b4d8cd0fb641d70f77462fc666d76506d538"} Dec 03 22:08:32 crc kubenswrapper[4830]: I1203 22:08:32.370462 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b013eefae0868c299ceb7b6c7e66b4d8cd0fb641d70f77462fc666d76506d538" Dec 03 22:08:32 crc kubenswrapper[4830]: I1203 22:08:32.370486 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 22:08:41 crc kubenswrapper[4830]: I1203 22:08:41.418028 4830 generic.go:334] "Generic (PLEG): container finished" podID="876ff782-f899-41ad-801d-52d31854b34c" containerID="901c475b99a3e9d20b92a0c344dc4bae26be276a2e43edca00ee4230867cbddf" exitCode=0 Dec 03 22:08:41 crc kubenswrapper[4830]: I1203 22:08:41.418106 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7zzm" event={"ID":"876ff782-f899-41ad-801d-52d31854b34c","Type":"ContainerDied","Data":"901c475b99a3e9d20b92a0c344dc4bae26be276a2e43edca00ee4230867cbddf"} Dec 03 22:08:41 crc kubenswrapper[4830]: I1203 22:08:41.423194 4830 generic.go:334] "Generic (PLEG): container finished" podID="d62dec8b-fae8-4022-bfb0-485be07c4700" containerID="c5dcbe50d02a6b8068207cfb5865d0748f2426b49a4f2653712e93d272f1b6ed" exitCode=0 Dec 03 22:08:41 crc kubenswrapper[4830]: I1203 22:08:41.423278 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-p8x7k" event={"ID":"d62dec8b-fae8-4022-bfb0-485be07c4700","Type":"ContainerDied","Data":"c5dcbe50d02a6b8068207cfb5865d0748f2426b49a4f2653712e93d272f1b6ed"} Dec 03 22:08:41 crc kubenswrapper[4830]: I1203 22:08:41.426237 4830 generic.go:334] "Generic (PLEG): container finished" podID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" containerID="690767b014fb1e802d006401ca1e3e22347936aef4943d2ba1d4313f9d7eeb68" exitCode=0 Dec 03 22:08:41 crc kubenswrapper[4830]: I1203 22:08:41.426298 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gkfj" event={"ID":"f30f50ee-2d3f-4d4d-9846-c9b916c42375","Type":"ContainerDied","Data":"690767b014fb1e802d006401ca1e3e22347936aef4943d2ba1d4313f9d7eeb68"} Dec 03 22:08:41 crc kubenswrapper[4830]: I1203 22:08:41.437048 4830 generic.go:334] "Generic (PLEG): container finished" podID="39a2a67d-4e6e-4514-9304-966057dd71bf" containerID="b4a53776de3f7be43d6112ca962e2c7a9e9c4c7a3e1e0d78109de1803304c426" exitCode=0 Dec 03 22:08:41 crc kubenswrapper[4830]: I1203 22:08:41.437182 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcjzb" event={"ID":"39a2a67d-4e6e-4514-9304-966057dd71bf","Type":"ContainerDied","Data":"b4a53776de3f7be43d6112ca962e2c7a9e9c4c7a3e1e0d78109de1803304c426"} Dec 03 22:08:41 crc kubenswrapper[4830]: I1203 22:08:41.459462 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbst4" event={"ID":"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a","Type":"ContainerStarted","Data":"76a7a8cceadf2ebcf75e16ad45a7709dade11b50cea8bff8144f3ce78534ae3b"} Dec 03 22:08:42 crc kubenswrapper[4830]: I1203 22:08:42.466945 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7zzm" event={"ID":"876ff782-f899-41ad-801d-52d31854b34c","Type":"ContainerStarted","Data":"c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c"} 
Dec 03 22:08:42 crc kubenswrapper[4830]: I1203 22:08:42.470126 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8x7k" event={"ID":"d62dec8b-fae8-4022-bfb0-485be07c4700","Type":"ContainerStarted","Data":"0fad731e2872f42025b449c90172b064478bde6e9307089a768e2d4d26fff2da"} Dec 03 22:08:42 crc kubenswrapper[4830]: I1203 22:08:42.472575 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gkfj" event={"ID":"f30f50ee-2d3f-4d4d-9846-c9b916c42375","Type":"ContainerStarted","Data":"5206ba578a37460c928b7aff61d1b70217adb86f48fccba5ecce0c2625f5d210"} Dec 03 22:08:42 crc kubenswrapper[4830]: I1203 22:08:42.474790 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvnwt" event={"ID":"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24","Type":"ContainerStarted","Data":"5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97"} Dec 03 22:08:42 crc kubenswrapper[4830]: I1203 22:08:42.477827 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcjzb" event={"ID":"39a2a67d-4e6e-4514-9304-966057dd71bf","Type":"ContainerStarted","Data":"9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854"} Dec 03 22:08:42 crc kubenswrapper[4830]: I1203 22:08:42.479742 4830 generic.go:334] "Generic (PLEG): container finished" podID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" containerID="76a7a8cceadf2ebcf75e16ad45a7709dade11b50cea8bff8144f3ce78534ae3b" exitCode=0 Dec 03 22:08:42 crc kubenswrapper[4830]: I1203 22:08:42.479781 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbst4" event={"ID":"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a","Type":"ContainerDied","Data":"76a7a8cceadf2ebcf75e16ad45a7709dade11b50cea8bff8144f3ce78534ae3b"} Dec 03 22:08:42 crc kubenswrapper[4830]: I1203 22:08:42.492957 4830 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-marketplace-j7zzm" podStartSLOduration=4.078200433 podStartE2EDuration="1m5.49293521s" podCreationTimestamp="2025-12-03 22:07:37 +0000 UTC" firstStartedPulling="2025-12-03 22:07:40.610950825 +0000 UTC m=+149.607412174" lastFinishedPulling="2025-12-03 22:08:42.025685602 +0000 UTC m=+211.022146951" observedRunningTime="2025-12-03 22:08:42.491207762 +0000 UTC m=+211.487669111" watchObservedRunningTime="2025-12-03 22:08:42.49293521 +0000 UTC m=+211.489396559" Dec 03 22:08:42 crc kubenswrapper[4830]: I1203 22:08:42.535661 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mcjzb" podStartSLOduration=5.282203212 podStartE2EDuration="1m7.535644619s" podCreationTimestamp="2025-12-03 22:07:35 +0000 UTC" firstStartedPulling="2025-12-03 22:07:39.633058373 +0000 UTC m=+148.629519722" lastFinishedPulling="2025-12-03 22:08:41.88649978 +0000 UTC m=+210.882961129" observedRunningTime="2025-12-03 22:08:42.534269322 +0000 UTC m=+211.530730681" watchObservedRunningTime="2025-12-03 22:08:42.535644619 +0000 UTC m=+211.532105968" Dec 03 22:08:42 crc kubenswrapper[4830]: I1203 22:08:42.590278 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9gkfj" podStartSLOduration=3.401583423 podStartE2EDuration="1m4.590253805s" podCreationTimestamp="2025-12-03 22:07:38 +0000 UTC" firstStartedPulling="2025-12-03 22:07:40.617761356 +0000 UTC m=+149.614222705" lastFinishedPulling="2025-12-03 22:08:41.806431728 +0000 UTC m=+210.802893087" observedRunningTime="2025-12-03 22:08:42.586901983 +0000 UTC m=+211.583363332" watchObservedRunningTime="2025-12-03 22:08:42.590253805 +0000 UTC m=+211.586715154" Dec 03 22:08:43 crc kubenswrapper[4830]: I1203 22:08:43.364614 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p8x7k" podStartSLOduration=5.125412156 
podStartE2EDuration="1m7.364597222s" podCreationTimestamp="2025-12-03 22:07:36 +0000 UTC" firstStartedPulling="2025-12-03 22:07:39.633108715 +0000 UTC m=+148.629570064" lastFinishedPulling="2025-12-03 22:08:41.872293781 +0000 UTC m=+210.868755130" observedRunningTime="2025-12-03 22:08:42.612109083 +0000 UTC m=+211.608570442" watchObservedRunningTime="2025-12-03 22:08:43.364597222 +0000 UTC m=+212.361058571" Dec 03 22:08:43 crc kubenswrapper[4830]: I1203 22:08:43.487697 4830 generic.go:334] "Generic (PLEG): container finished" podID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerID="5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97" exitCode=0 Dec 03 22:08:43 crc kubenswrapper[4830]: I1203 22:08:43.487756 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvnwt" event={"ID":"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24","Type":"ContainerDied","Data":"5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97"} Dec 03 22:08:43 crc kubenswrapper[4830]: I1203 22:08:43.494272 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbst4" event={"ID":"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a","Type":"ContainerStarted","Data":"af3f0ff12188e8dc4cee1702fe86214af37618a717ca35e20a2f90fef893e716"} Dec 03 22:08:43 crc kubenswrapper[4830]: I1203 22:08:43.525454 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tbst4" podStartSLOduration=2.314387173 podStartE2EDuration="1m4.525434248s" podCreationTimestamp="2025-12-03 22:07:39 +0000 UTC" firstStartedPulling="2025-12-03 22:07:40.668692388 +0000 UTC m=+149.665153737" lastFinishedPulling="2025-12-03 22:08:42.879739463 +0000 UTC m=+211.876200812" observedRunningTime="2025-12-03 22:08:43.524901033 +0000 UTC m=+212.521362382" watchObservedRunningTime="2025-12-03 22:08:43.525434248 +0000 UTC m=+212.521895597" Dec 03 22:08:44 crc kubenswrapper[4830]: I1203 22:08:44.500218 
4830 generic.go:334] "Generic (PLEG): container finished" podID="d80631b4-3fa5-491b-b330-80f733c3b0a4" containerID="a038e3b53efe8b416898d247cc0e47b7c93f3fee8e11ea19118f64869624c856" exitCode=0
Dec 03 22:08:44 crc kubenswrapper[4830]: I1203 22:08:44.500295 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxd8v" event={"ID":"d80631b4-3fa5-491b-b330-80f733c3b0a4","Type":"ContainerDied","Data":"a038e3b53efe8b416898d247cc0e47b7c93f3fee8e11ea19118f64869624c856"}
Dec 03 22:08:44 crc kubenswrapper[4830]: I1203 22:08:44.503037 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvnwt" event={"ID":"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24","Type":"ContainerStarted","Data":"1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db"}
Dec 03 22:08:44 crc kubenswrapper[4830]: I1203 22:08:44.504833 4830 generic.go:334] "Generic (PLEG): container finished" podID="7ae91290-b6c1-4aab-a373-dec2848c94db" containerID="d6f38ab874c577631f49ddf7aef2bee9fff1e8257a2b076a9b0aa10a197a5d84" exitCode=0
Dec 03 22:08:44 crc kubenswrapper[4830]: I1203 22:08:44.504880 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k72v" event={"ID":"7ae91290-b6c1-4aab-a373-dec2848c94db","Type":"ContainerDied","Data":"d6f38ab874c577631f49ddf7aef2bee9fff1e8257a2b076a9b0aa10a197a5d84"}
Dec 03 22:08:44 crc kubenswrapper[4830]: I1203 22:08:44.554150 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nvnwt" podStartSLOduration=3.319899191 podStartE2EDuration="1m6.554132112s" podCreationTimestamp="2025-12-03 22:07:38 +0000 UTC" firstStartedPulling="2025-12-03 22:07:40.664799939 +0000 UTC m=+149.661261288" lastFinishedPulling="2025-12-03 22:08:43.89903286 +0000 UTC m=+212.895494209" observedRunningTime="2025-12-03 22:08:44.553179915 +0000 UTC m=+213.549641274" watchObservedRunningTime="2025-12-03 22:08:44.554132112 +0000 UTC m=+213.550593461"
Dec 03 22:08:45 crc kubenswrapper[4830]: I1203 22:08:45.513254 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k72v" event={"ID":"7ae91290-b6c1-4aab-a373-dec2848c94db","Type":"ContainerStarted","Data":"11a3b60a964c5a92842579e0a8d61a67b81c19ea3e1890a6b329a66aebfb46f5"}
Dec 03 22:08:45 crc kubenswrapper[4830]: I1203 22:08:45.515303 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxd8v" event={"ID":"d80631b4-3fa5-491b-b330-80f733c3b0a4","Type":"ContainerStarted","Data":"a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816"}
Dec 03 22:08:45 crc kubenswrapper[4830]: I1203 22:08:45.538042 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9k72v" podStartSLOduration=4.252739995 podStartE2EDuration="1m9.538017118s" podCreationTimestamp="2025-12-03 22:07:36 +0000 UTC" firstStartedPulling="2025-12-03 22:07:39.632971941 +0000 UTC m=+148.629433290" lastFinishedPulling="2025-12-03 22:08:44.918249064 +0000 UTC m=+213.914710413" observedRunningTime="2025-12-03 22:08:45.536769413 +0000 UTC m=+214.533230762" watchObservedRunningTime="2025-12-03 22:08:45.538017118 +0000 UTC m=+214.534478467"
Dec 03 22:08:45 crc kubenswrapper[4830]: I1203 22:08:45.558265 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gxd8v" podStartSLOduration=4.132954534 podStartE2EDuration="1m10.558242002s" podCreationTimestamp="2025-12-03 22:07:35 +0000 UTC" firstStartedPulling="2025-12-03 22:07:38.46315619 +0000 UTC m=+147.459617529" lastFinishedPulling="2025-12-03 22:08:44.888443638 +0000 UTC m=+213.884904997" observedRunningTime="2025-12-03 22:08:45.557324557 +0000 UTC m=+214.553785916" watchObservedRunningTime="2025-12-03 22:08:45.558242002 +0000 UTC m=+214.554703351"
Dec 03 22:08:46 crc kubenswrapper[4830]: I1203 22:08:46.577923 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mcjzb"
Dec 03 22:08:46 crc kubenswrapper[4830]: I1203 22:08:46.577981 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mcjzb"
Dec 03 22:08:46 crc kubenswrapper[4830]: I1203 22:08:46.602974 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gxd8v"
Dec 03 22:08:46 crc kubenswrapper[4830]: I1203 22:08:46.603041 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gxd8v"
Dec 03 22:08:46 crc kubenswrapper[4830]: I1203 22:08:46.639552 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p8x7k"
Dec 03 22:08:46 crc kubenswrapper[4830]: I1203 22:08:46.639609 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p8x7k"
Dec 03 22:08:46 crc kubenswrapper[4830]: I1203 22:08:46.983465 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9k72v"
Dec 03 22:08:46 crc kubenswrapper[4830]: I1203 22:08:46.983826 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9k72v"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.478427 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j7zzm"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.478492 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j7zzm"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.748482 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p8x7k"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.748734 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gxd8v"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.749026 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j7zzm"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.749656 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mcjzb"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.751007 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9k72v"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.768304 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9gkfj"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.774015 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9gkfj"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.800676 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j7zzm"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.801303 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mcjzb"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.815108 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p8x7k"
Dec 03 22:08:48 crc kubenswrapper[4830]: I1203 22:08:48.821664 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9gkfj"
Dec 03 22:08:49 crc kubenswrapper[4830]: I1203 22:08:49.325950 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nvnwt"
Dec 03 22:08:49 crc kubenswrapper[4830]: I1203 22:08:49.326005 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nvnwt"
Dec 03 22:08:49 crc kubenswrapper[4830]: I1203 22:08:49.610796 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9gkfj"
Dec 03 22:08:49 crc kubenswrapper[4830]: I1203 22:08:49.705689 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tbst4"
Dec 03 22:08:49 crc kubenswrapper[4830]: I1203 22:08:49.705783 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tbst4"
Dec 03 22:08:49 crc kubenswrapper[4830]: I1203 22:08:49.763256 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tbst4"
Dec 03 22:08:50 crc kubenswrapper[4830]: I1203 22:08:50.379226 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nvnwt" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerName="registry-server" probeResult="failure" output=<
Dec 03 22:08:50 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s
Dec 03 22:08:50 crc kubenswrapper[4830]: >
Dec 03 22:08:50 crc kubenswrapper[4830]: I1203 22:08:50.622452 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tbst4"
Dec 03 22:08:52 crc kubenswrapper[4830]: I1203 22:08:52.587467 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p8x7k"]
Dec 03 22:08:52 crc kubenswrapper[4830]: I1203 22:08:52.588036 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p8x7k" podUID="d62dec8b-fae8-4022-bfb0-485be07c4700" containerName="registry-server" containerID="cri-o://0fad731e2872f42025b449c90172b064478bde6e9307089a768e2d4d26fff2da" gracePeriod=2
Dec 03 22:08:52 crc kubenswrapper[4830]: I1203 22:08:52.790203 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gkfj"]
Dec 03 22:08:52 crc kubenswrapper[4830]: I1203 22:08:52.790690 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9gkfj" podUID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" containerName="registry-server" containerID="cri-o://5206ba578a37460c928b7aff61d1b70217adb86f48fccba5ecce0c2625f5d210" gracePeriod=2
Dec 03 22:08:54 crc kubenswrapper[4830]: I1203 22:08:54.581101 4830 generic.go:334] "Generic (PLEG): container finished" podID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" containerID="5206ba578a37460c928b7aff61d1b70217adb86f48fccba5ecce0c2625f5d210" exitCode=0
Dec 03 22:08:54 crc kubenswrapper[4830]: I1203 22:08:54.581177 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gkfj" event={"ID":"f30f50ee-2d3f-4d4d-9846-c9b916c42375","Type":"ContainerDied","Data":"5206ba578a37460c928b7aff61d1b70217adb86f48fccba5ecce0c2625f5d210"}
Dec 03 22:08:54 crc kubenswrapper[4830]: I1203 22:08:54.585701 4830 generic.go:334] "Generic (PLEG): container finished" podID="d62dec8b-fae8-4022-bfb0-485be07c4700" containerID="0fad731e2872f42025b449c90172b064478bde6e9307089a768e2d4d26fff2da" exitCode=0
Dec 03 22:08:54 crc kubenswrapper[4830]: I1203 22:08:54.585744 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8x7k" event={"ID":"d62dec8b-fae8-4022-bfb0-485be07c4700","Type":"ContainerDied","Data":"0fad731e2872f42025b449c90172b064478bde6e9307089a768e2d4d26fff2da"}
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.004293 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbst4"]
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.004790 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tbst4" podUID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" containerName="registry-server" containerID="cri-o://af3f0ff12188e8dc4cee1702fe86214af37618a717ca35e20a2f90fef893e716" gracePeriod=2
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.189856 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gkfj"
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.253068 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-catalog-content\") pod \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") "
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.253208 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5brgk\" (UniqueName: \"kubernetes.io/projected/f30f50ee-2d3f-4d4d-9846-c9b916c42375-kube-api-access-5brgk\") pod \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") "
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.253345 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-utilities\") pod \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\" (UID: \"f30f50ee-2d3f-4d4d-9846-c9b916c42375\") "
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.254469 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-utilities" (OuterVolumeSpecName: "utilities") pod "f30f50ee-2d3f-4d4d-9846-c9b916c42375" (UID: "f30f50ee-2d3f-4d4d-9846-c9b916c42375"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.277589 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f30f50ee-2d3f-4d4d-9846-c9b916c42375-kube-api-access-5brgk" (OuterVolumeSpecName: "kube-api-access-5brgk") pod "f30f50ee-2d3f-4d4d-9846-c9b916c42375" (UID: "f30f50ee-2d3f-4d4d-9846-c9b916c42375"). InnerVolumeSpecName "kube-api-access-5brgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.354828 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5brgk\" (UniqueName: \"kubernetes.io/projected/f30f50ee-2d3f-4d4d-9846-c9b916c42375-kube-api-access-5brgk\") on node \"crc\" DevicePath \"\""
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.354884 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.359209 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p8x7k"
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.521206 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f30f50ee-2d3f-4d4d-9846-c9b916c42375" (UID: "f30f50ee-2d3f-4d4d-9846-c9b916c42375"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.559135 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-utilities\") pod \"d62dec8b-fae8-4022-bfb0-485be07c4700\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") "
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.559236 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd4rw\" (UniqueName: \"kubernetes.io/projected/d62dec8b-fae8-4022-bfb0-485be07c4700-kube-api-access-zd4rw\") pod \"d62dec8b-fae8-4022-bfb0-485be07c4700\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") "
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.559326 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-catalog-content\") pod \"d62dec8b-fae8-4022-bfb0-485be07c4700\" (UID: \"d62dec8b-fae8-4022-bfb0-485be07c4700\") "
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.559597 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f30f50ee-2d3f-4d4d-9846-c9b916c42375-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.559891 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-utilities" (OuterVolumeSpecName: "utilities") pod "d62dec8b-fae8-4022-bfb0-485be07c4700" (UID: "d62dec8b-fae8-4022-bfb0-485be07c4700"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.563291 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62dec8b-fae8-4022-bfb0-485be07c4700-kube-api-access-zd4rw" (OuterVolumeSpecName: "kube-api-access-zd4rw") pod "d62dec8b-fae8-4022-bfb0-485be07c4700" (UID: "d62dec8b-fae8-4022-bfb0-485be07c4700"). InnerVolumeSpecName "kube-api-access-zd4rw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.594044 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8x7k" event={"ID":"d62dec8b-fae8-4022-bfb0-485be07c4700","Type":"ContainerDied","Data":"34f6cbad054928f62c1cb5dabceaee5faef4c3076a44347c3324f8123e66e991"}
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.594103 4830 scope.go:117] "RemoveContainer" containerID="0fad731e2872f42025b449c90172b064478bde6e9307089a768e2d4d26fff2da"
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.594181 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p8x7k"
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.596346 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gkfj" event={"ID":"f30f50ee-2d3f-4d4d-9846-c9b916c42375","Type":"ContainerDied","Data":"af98a231067e10734fb0ad659358a9f12c1f4b2669ce405c8ba541d65c3dd5e3"}
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.596471 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gkfj"
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.621437 4830 scope.go:117] "RemoveContainer" containerID="c5dcbe50d02a6b8068207cfb5865d0748f2426b49a4f2653712e93d272f1b6ed"
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.630560 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gkfj"]
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.635326 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gkfj"]
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.642035 4830 scope.go:117] "RemoveContainer" containerID="e753d5689faaa14eb1e96d006f9d07d461431388a9463468531d9678a4f40a56"
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.653853 4830 scope.go:117] "RemoveContainer" containerID="5206ba578a37460c928b7aff61d1b70217adb86f48fccba5ecce0c2625f5d210"
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.661399 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.661617 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd4rw\" (UniqueName: \"kubernetes.io/projected/d62dec8b-fae8-4022-bfb0-485be07c4700-kube-api-access-zd4rw\") on node \"crc\" DevicePath \"\""
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.666212 4830 scope.go:117] "RemoveContainer" containerID="690767b014fb1e802d006401ca1e3e22347936aef4943d2ba1d4313f9d7eeb68"
Dec 03 22:08:55 crc kubenswrapper[4830]: I1203 22:08:55.685295 4830 scope.go:117] "RemoveContainer" containerID="396ff73a9172c6d5b3c0f01b72b3cdca74d52590be92d7f5c37e3a24fcc1d9e6"
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.612539 4830 generic.go:334] "Generic (PLEG): container finished" podID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" containerID="af3f0ff12188e8dc4cee1702fe86214af37618a717ca35e20a2f90fef893e716" exitCode=0
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.612697 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbst4" event={"ID":"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a","Type":"ContainerDied","Data":"af3f0ff12188e8dc4cee1702fe86214af37618a717ca35e20a2f90fef893e716"}
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.652053 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d62dec8b-fae8-4022-bfb0-485be07c4700" (UID: "d62dec8b-fae8-4022-bfb0-485be07c4700"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.662009 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gxd8v"
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.677104 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d62dec8b-fae8-4022-bfb0-485be07c4700-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.680774 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.680830 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.680879 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k"
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.681621 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d384acee36d352984805a1fbebe07735a2cccefaaedfc389a65a023cd6463f49"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.681749 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://d384acee36d352984805a1fbebe07735a2cccefaaedfc389a65a023cd6463f49" gracePeriod=600
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.837181 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p8x7k"]
Dec 03 22:08:56 crc kubenswrapper[4830]: I1203 22:08:56.840742 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p8x7k"]
Dec 03 22:08:57 crc kubenswrapper[4830]: I1203 22:08:57.033358 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9k72v"
Dec 03 22:08:57 crc kubenswrapper[4830]: I1203 22:08:57.346814 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62dec8b-fae8-4022-bfb0-485be07c4700" path="/var/lib/kubelet/pods/d62dec8b-fae8-4022-bfb0-485be07c4700/volumes"
Dec 03 22:08:57 crc kubenswrapper[4830]: I1203 22:08:57.348197 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" path="/var/lib/kubelet/pods/f30f50ee-2d3f-4d4d-9846-c9b916c42375/volumes"
Dec 03 22:08:57 crc kubenswrapper[4830]: I1203 22:08:57.922235 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbst4"
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.096584 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-utilities\") pod \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\" (UID: \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") "
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.096786 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsv6v\" (UniqueName: \"kubernetes.io/projected/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-kube-api-access-xsv6v\") pod \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\" (UID: \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") "
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.096829 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-catalog-content\") pod \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\" (UID: \"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a\") "
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.098750 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-utilities" (OuterVolumeSpecName: "utilities") pod "37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" (UID: "37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.105248 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-kube-api-access-xsv6v" (OuterVolumeSpecName: "kube-api-access-xsv6v") pod "37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" (UID: "37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a"). InnerVolumeSpecName "kube-api-access-xsv6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.198293 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.198374 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsv6v\" (UniqueName: \"kubernetes.io/projected/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-kube-api-access-xsv6v\") on node \"crc\" DevicePath \"\""
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.276001 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" (UID: "37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.299161 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.629388 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbst4" event={"ID":"37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a","Type":"ContainerDied","Data":"b9ab4e2e0acac7f02bf97410039c6503787c8266ddcf4ddc885912699c33c0f8"}
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.629532 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbst4"
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.629899 4830 scope.go:117] "RemoveContainer" containerID="af3f0ff12188e8dc4cee1702fe86214af37618a717ca35e20a2f90fef893e716"
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.632134 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="d384acee36d352984805a1fbebe07735a2cccefaaedfc389a65a023cd6463f49" exitCode=0
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.632162 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"d384acee36d352984805a1fbebe07735a2cccefaaedfc389a65a023cd6463f49"}
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.632179 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"152ffa1ead369b08b371ccd992972b35f88c027ff3c25c22eaf22bfaf6d442f6"}
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.651900 4830 scope.go:117] "RemoveContainer" containerID="76a7a8cceadf2ebcf75e16ad45a7709dade11b50cea8bff8144f3ce78534ae3b"
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.685880 4830 scope.go:117] "RemoveContainer" containerID="fd5f2b3c4611ac9d79b92cc44d339a9848406d46362dabd718d7f8ec753cf731"
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.689044 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbst4"]
Dec 03 22:08:58 crc kubenswrapper[4830]: I1203 22:08:58.696666 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tbst4"]
Dec 03 22:08:59 crc kubenswrapper[4830]: I1203 22:08:59.194751 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9k72v"]
Dec 03 22:08:59 crc kubenswrapper[4830]: I1203 22:08:59.195178 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9k72v" podUID="7ae91290-b6c1-4aab-a373-dec2848c94db" containerName="registry-server" containerID="cri-o://11a3b60a964c5a92842579e0a8d61a67b81c19ea3e1890a6b329a66aebfb46f5" gracePeriod=2
Dec 03 22:08:59 crc kubenswrapper[4830]: I1203 22:08:59.346634 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" path="/var/lib/kubelet/pods/37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a/volumes"
Dec 03 22:08:59 crc kubenswrapper[4830]: I1203 22:08:59.375418 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nvnwt"
Dec 03 22:08:59 crc kubenswrapper[4830]: I1203 22:08:59.434145 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nvnwt"
Dec 03 22:08:59 crc kubenswrapper[4830]: I1203 22:08:59.641047 4830 generic.go:334] "Generic (PLEG): container finished" podID="7ae91290-b6c1-4aab-a373-dec2848c94db" containerID="11a3b60a964c5a92842579e0a8d61a67b81c19ea3e1890a6b329a66aebfb46f5" exitCode=0
Dec 03 22:08:59 crc kubenswrapper[4830]: I1203 22:08:59.641130 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k72v" event={"ID":"7ae91290-b6c1-4aab-a373-dec2848c94db","Type":"ContainerDied","Data":"11a3b60a964c5a92842579e0a8d61a67b81c19ea3e1890a6b329a66aebfb46f5"}
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.114270 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9k72v"
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.228545 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xlvh\" (UniqueName: \"kubernetes.io/projected/7ae91290-b6c1-4aab-a373-dec2848c94db-kube-api-access-5xlvh\") pod \"7ae91290-b6c1-4aab-a373-dec2848c94db\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") "
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.228629 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-utilities\") pod \"7ae91290-b6c1-4aab-a373-dec2848c94db\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") "
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.228659 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-catalog-content\") pod \"7ae91290-b6c1-4aab-a373-dec2848c94db\" (UID: \"7ae91290-b6c1-4aab-a373-dec2848c94db\") "
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.229618 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-utilities" (OuterVolumeSpecName: "utilities") pod "7ae91290-b6c1-4aab-a373-dec2848c94db" (UID: "7ae91290-b6c1-4aab-a373-dec2848c94db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.234462 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae91290-b6c1-4aab-a373-dec2848c94db-kube-api-access-5xlvh" (OuterVolumeSpecName: "kube-api-access-5xlvh") pod "7ae91290-b6c1-4aab-a373-dec2848c94db" (UID: "7ae91290-b6c1-4aab-a373-dec2848c94db"). InnerVolumeSpecName "kube-api-access-5xlvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.278480 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ae91290-b6c1-4aab-a373-dec2848c94db" (UID: "7ae91290-b6c1-4aab-a373-dec2848c94db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.331905 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xlvh\" (UniqueName: \"kubernetes.io/projected/7ae91290-b6c1-4aab-a373-dec2848c94db-kube-api-access-5xlvh\") on node \"crc\" DevicePath \"\""
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.331941 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.331953 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae91290-b6c1-4aab-a373-dec2848c94db-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.654488 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9k72v" event={"ID":"7ae91290-b6c1-4aab-a373-dec2848c94db","Type":"ContainerDied","Data":"a31e9a60f12770909772f0d5ab8aa9e83d8e867630c7a1aaca3cdcbd3a0a6e39"}
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.654619 4830 scope.go:117] "RemoveContainer" containerID="11a3b60a964c5a92842579e0a8d61a67b81c19ea3e1890a6b329a66aebfb46f5"
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.654644 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9k72v"
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.683364 4830 scope.go:117] "RemoveContainer" containerID="d6f38ab874c577631f49ddf7aef2bee9fff1e8257a2b076a9b0aa10a197a5d84"
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.697631 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9k72v"]
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.712369 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9k72v"]
Dec 03 22:09:00 crc kubenswrapper[4830]: I1203 22:09:00.750250 4830 scope.go:117] "RemoveContainer" containerID="91bfdac2608ec8c9fbbf9a57dbe3f434760e2a0270d4e90bbc9214ce5268fdae"
Dec 03 22:09:01 crc kubenswrapper[4830]: I1203 22:09:01.348757 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae91290-b6c1-4aab-a373-dec2848c94db" path="/var/lib/kubelet/pods/7ae91290-b6c1-4aab-a373-dec2848c94db/volumes"
Dec 03 22:09:01 crc kubenswrapper[4830]: I1203 22:09:01.386857 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lz46c"]
Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.121165 4830 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: 
couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124181 4830 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.124656 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae91290-b6c1-4aab-a373-dec2848c94db" containerName="extract-content" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124690 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae91290-b6c1-4aab-a373-dec2848c94db" containerName="extract-content" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.124712 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" containerName="extract-content" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124728 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" containerName="extract-content" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.124743 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62dec8b-fae8-4022-bfb0-485be07c4700" containerName="extract-utilities" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124755 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62dec8b-fae8-4022-bfb0-485be07c4700" containerName="extract-utilities" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.124773 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ff58c2-436e-44e9-b752-6d2fd550ebb4" containerName="pruner" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124785 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ff58c2-436e-44e9-b752-6d2fd550ebb4" containerName="pruner" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.124800 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d62dec8b-fae8-4022-bfb0-485be07c4700" containerName="extract-content" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124812 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62dec8b-fae8-4022-bfb0-485be07c4700" containerName="extract-content" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.124833 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62dec8b-fae8-4022-bfb0-485be07c4700" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124845 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62dec8b-fae8-4022-bfb0-485be07c4700" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.124863 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae91290-b6c1-4aab-a373-dec2848c94db" containerName="extract-utilities" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124875 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae91290-b6c1-4aab-a373-dec2848c94db" containerName="extract-utilities" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.124895 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ece324a-613f-44ac-9c33-2b06873c1d22" containerName="pruner" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124907 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ece324a-613f-44ac-9c33-2b06873c1d22" containerName="pruner" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.124926 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae91290-b6c1-4aab-a373-dec2848c94db" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124937 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae91290-b6c1-4aab-a373-dec2848c94db" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.124956 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124968 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.124985 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" containerName="extract-content" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.124996 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" containerName="extract-content" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.125011 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.125022 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.125036 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" containerName="extract-utilities" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.125047 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" containerName="extract-utilities" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.125063 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" containerName="extract-utilities" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.125075 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" containerName="extract-utilities" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.125240 4830 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7ae91290-b6c1-4aab-a373-dec2848c94db" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.125258 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ff58c2-436e-44e9-b752-6d2fd550ebb4" containerName="pruner" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.125274 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ece324a-613f-44ac-9c33-2b06873c1d22" containerName="pruner" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.125296 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f30f50ee-2d3f-4d4d-9846-c9b916c42375" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.125311 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c2b0de-36ff-4955-9ed8-a2c5a7c6c82a" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.125331 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62dec8b-fae8-4022-bfb0-485be07c4700" containerName="registry-server" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.125974 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.170991 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.216906 4830 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.217319 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170" gracePeriod=15 Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.217493 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6" gracePeriod=15 Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.217593 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275" gracePeriod=15 Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.217653 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27" gracePeriod=15 Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 
22:09:07.217764 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6" gracePeriod=15 Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.219383 4830 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.219749 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.219771 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.219803 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.219818 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.219838 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.220957 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.221006 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 
22:09:07.221027 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.221051 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.221069 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.221097 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.221114 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.221365 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.221391 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.221410 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.221430 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.221445 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.221463 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.221673 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.221691 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.236970 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.237054 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.237152 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.237330 
4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.237378 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.338420 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.338564 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.338637 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 
22:09:07.338730 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.338796 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.338862 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.338941 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.338994 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 
22:09:07.339153 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.339252 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.339341 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.339417 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.339482 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.441626 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.442117 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.442178 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.442248 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.441918 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.442623 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.465184 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:07 crc kubenswrapper[4830]: W1203 22:09:07.482487 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-83d9cd720528c7b894b0693843e8e0704247cf0f3582c5784d7cc8f3c9f96812 WatchSource:0}: Error finding container 83d9cd720528c7b894b0693843e8e0704247cf0f3582c5784d7cc8f3c9f96812: Status 404 returned error can't find the container with id 83d9cd720528c7b894b0693843e8e0704247cf0f3582c5784d7cc8f3c9f96812 Dec 03 22:09:07 crc kubenswrapper[4830]: E1203 22:09:07.486286 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.217:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dd4052091acfb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 22:09:07.485445371 +0000 UTC m=+236.481906720,LastTimestamp:2025-12-03 22:09:07.485445371 +0000 UTC m=+236.481906720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.702869 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"83d9cd720528c7b894b0693843e8e0704247cf0f3582c5784d7cc8f3c9f96812"} Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.708037 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.710786 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.711791 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6" exitCode=0 Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.711811 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6" exitCode=0 Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.711818 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275" exitCode=0 Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.711825 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27" exitCode=2 Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.711842 4830 scope.go:117] 
"RemoveContainer" containerID="90b68a61e69e38ddfe0836f99c9809bb10212b166468e9797c5dec9d3e7ebe0d" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.714451 4830 generic.go:334] "Generic (PLEG): container finished" podID="e85574f9-5751-45ae-a62f-5c8cce45e669" containerID="24937d8ac003416a7379dda08bee7d94b57af4d90fddd93811e83ac74e07802e" exitCode=0 Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.714501 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e85574f9-5751-45ae-a62f-5c8cce45e669","Type":"ContainerDied","Data":"24937d8ac003416a7379dda08bee7d94b57af4d90fddd93811e83ac74e07802e"} Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.715140 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:07 crc kubenswrapper[4830]: I1203 22:09:07.715392 4830 status_manager.go:851] "Failed to get status for pod" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:08 crc kubenswrapper[4830]: I1203 22:09:08.723629 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ba8c3b6676b7d9b0a35863469fdf6d9894e269f42a34b6ac89533936345ab8df"} Dec 03 22:09:08 crc kubenswrapper[4830]: I1203 22:09:08.725847 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:08 crc kubenswrapper[4830]: I1203 22:09:08.726876 4830 status_manager.go:851] "Failed to get status for pod" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:08 crc kubenswrapper[4830]: I1203 22:09:08.728728 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.099562 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.100449 4830 status_manager.go:851] "Failed to get status for pod" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.101062 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.268994 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-kubelet-dir\") pod \"e85574f9-5751-45ae-a62f-5c8cce45e669\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.269409 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-var-lock\") pod \"e85574f9-5751-45ae-a62f-5c8cce45e669\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.269565 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e85574f9-5751-45ae-a62f-5c8cce45e669-kube-api-access\") pod \"e85574f9-5751-45ae-a62f-5c8cce45e669\" (UID: \"e85574f9-5751-45ae-a62f-5c8cce45e669\") " Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.269689 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e85574f9-5751-45ae-a62f-5c8cce45e669" (UID: "e85574f9-5751-45ae-a62f-5c8cce45e669"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.269732 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-var-lock" (OuterVolumeSpecName: "var-lock") pod "e85574f9-5751-45ae-a62f-5c8cce45e669" (UID: "e85574f9-5751-45ae-a62f-5c8cce45e669"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.269975 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.269999 4830 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e85574f9-5751-45ae-a62f-5c8cce45e669-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.286316 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85574f9-5751-45ae-a62f-5c8cce45e669-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e85574f9-5751-45ae-a62f-5c8cce45e669" (UID: "e85574f9-5751-45ae-a62f-5c8cce45e669"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.371383 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e85574f9-5751-45ae-a62f-5c8cce45e669-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.746853 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e85574f9-5751-45ae-a62f-5c8cce45e669","Type":"ContainerDied","Data":"25bb64b07e6f934eeb490c8f9b2c7df6f5e740844af90f1e038b686579fbed47"} Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.746931 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25bb64b07e6f934eeb490c8f9b2c7df6f5e740844af90f1e038b686579fbed47" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.747297 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.753225 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:09 crc kubenswrapper[4830]: I1203 22:09:09.754018 4830 status_manager.go:851] "Failed to get status for pod" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.051673 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.052315 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.053074 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.053643 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.054202 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.054254 4830 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.054721 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" interval="200ms" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.255709 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" interval="400ms" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.304898 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.306223 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.306941 4830 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.307388 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.308734 4830 status_manager.go:851] "Failed to get status for pod" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.487761 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.487808 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.487819 4830 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.487889 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.487930 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.488067 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.488397 4830 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.488417 4830 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.488426 4830 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.658092 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" interval="800ms" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.758833 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.760172 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170" exitCode=0 Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.760247 4830 scope.go:117] "RemoveContainer" containerID="d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.760306 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.789265 4830 scope.go:117] "RemoveContainer" containerID="66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.789659 4830 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.790094 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.790464 4830 status_manager.go:851] "Failed to get status for pod" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.811541 4830 scope.go:117] "RemoveContainer" containerID="005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.829895 4830 scope.go:117] "RemoveContainer" containerID="8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.844093 4830 scope.go:117] "RemoveContainer" containerID="8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170" Dec 03 22:09:10 crc 
kubenswrapper[4830]: I1203 22:09:10.865336 4830 scope.go:117] "RemoveContainer" containerID="b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.889983 4830 scope.go:117] "RemoveContainer" containerID="d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.890659 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\": container with ID starting with d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6 not found: ID does not exist" containerID="d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.890720 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6"} err="failed to get container status \"d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\": rpc error: code = NotFound desc = could not find container \"d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6\": container with ID starting with d163fd0597f09e9f9b727f1d07a01252460ca2915a5e0f47dcdc942aca34e8b6 not found: ID does not exist" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.890765 4830 scope.go:117] "RemoveContainer" containerID="66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.891159 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\": container with ID starting with 66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6 not found: ID does not exist" 
containerID="66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.891225 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6"} err="failed to get container status \"66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\": rpc error: code = NotFound desc = could not find container \"66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6\": container with ID starting with 66d796bcd1bad712c37bf0db87c875e11ba57e061881ec26de077d3a10a346b6 not found: ID does not exist" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.891267 4830 scope.go:117] "RemoveContainer" containerID="005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.891646 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\": container with ID starting with 005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275 not found: ID does not exist" containerID="005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.891680 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275"} err="failed to get container status \"005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\": rpc error: code = NotFound desc = could not find container \"005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275\": container with ID starting with 005f1124429891acf6ff05d6b7b099629a6ddab6d4ddfb330232674118908275 not found: ID does not exist" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.891706 4830 scope.go:117] 
"RemoveContainer" containerID="8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.892836 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\": container with ID starting with 8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27 not found: ID does not exist" containerID="8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.892865 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27"} err="failed to get container status \"8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\": rpc error: code = NotFound desc = could not find container \"8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27\": container with ID starting with 8499dabb698c984da88566a87a774fc562eb9976a35bf0dd3a6f21ff3c869a27 not found: ID does not exist" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.892883 4830 scope.go:117] "RemoveContainer" containerID="8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.893650 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\": container with ID starting with 8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170 not found: ID does not exist" containerID="8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.894005 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170"} err="failed to get container status \"8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\": rpc error: code = NotFound desc = could not find container \"8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170\": container with ID starting with 8ff9e476f853bfc077bd83d9490571b67dce1351bb842d1e8ccb41cd5a127170 not found: ID does not exist" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.894049 4830 scope.go:117] "RemoveContainer" containerID="b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7" Dec 03 22:09:10 crc kubenswrapper[4830]: E1203 22:09:10.894713 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\": container with ID starting with b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7 not found: ID does not exist" containerID="b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7" Dec 03 22:09:10 crc kubenswrapper[4830]: I1203 22:09:10.894754 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7"} err="failed to get container status \"b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\": rpc error: code = NotFound desc = could not find container \"b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7\": container with ID starting with b806ec719138c3a1e2f478744937d1d4eb865a3283233656a2aa1a33c88de0b7 not found: ID does not exist" Dec 03 22:09:11 crc kubenswrapper[4830]: I1203 22:09:11.341008 4830 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:11 crc kubenswrapper[4830]: I1203 22:09:11.341752 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:11 crc kubenswrapper[4830]: I1203 22:09:11.342186 4830 status_manager.go:851] "Failed to get status for pod" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:11 crc kubenswrapper[4830]: I1203 22:09:11.349885 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 03 22:09:11 crc kubenswrapper[4830]: E1203 22:09:11.422935 4830 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.217:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" volumeName="registry-storage" Dec 03 22:09:11 crc kubenswrapper[4830]: E1203 22:09:11.458712 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.217:6443: connect: connection refused" interval="1.6s" Dec 03 22:09:11 crc kubenswrapper[4830]: E1203 22:09:11.876482 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.217:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dd4052091acfb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 22:09:07.485445371 +0000 UTC m=+236.481906720,LastTimestamp:2025-12-03 22:09:07.485445371 +0000 UTC m=+236.481906720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 22:09:13 crc kubenswrapper[4830]: E1203 22:09:13.061731 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" interval="3.2s" Dec 03 22:09:16 crc kubenswrapper[4830]: E1203 22:09:16.264052 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.217:6443: connect: connection refused" interval="6.4s" Dec 03 22:09:17 crc kubenswrapper[4830]: I1203 
22:09:17.336465 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:17 crc kubenswrapper[4830]: I1203 22:09:17.337979 4830 status_manager.go:851] "Failed to get status for pod" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:17 crc kubenswrapper[4830]: I1203 22:09:17.338427 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:17 crc kubenswrapper[4830]: I1203 22:09:17.362288 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:17 crc kubenswrapper[4830]: I1203 22:09:17.362336 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:17 crc kubenswrapper[4830]: E1203 22:09:17.362916 4830 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:17 crc kubenswrapper[4830]: I1203 22:09:17.363462 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:17 crc kubenswrapper[4830]: W1203 22:09:17.405198 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-776d1496383c3aec5229c1ca9fc6699d5f8b2396ac9891fdcfadf451e4966116 WatchSource:0}: Error finding container 776d1496383c3aec5229c1ca9fc6699d5f8b2396ac9891fdcfadf451e4966116: Status 404 returned error can't find the container with id 776d1496383c3aec5229c1ca9fc6699d5f8b2396ac9891fdcfadf451e4966116 Dec 03 22:09:17 crc kubenswrapper[4830]: I1203 22:09:17.814383 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"776d1496383c3aec5229c1ca9fc6699d5f8b2396ac9891fdcfadf451e4966116"} Dec 03 22:09:18 crc kubenswrapper[4830]: I1203 22:09:18.824967 4830 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c30c3eb84540211eb8fb91c54513500fbbab964c6f187881b93cba4392398afc" exitCode=0 Dec 03 22:09:18 crc kubenswrapper[4830]: I1203 22:09:18.825046 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c30c3eb84540211eb8fb91c54513500fbbab964c6f187881b93cba4392398afc"} Dec 03 22:09:18 crc kubenswrapper[4830]: I1203 22:09:18.825444 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:18 crc kubenswrapper[4830]: I1203 22:09:18.826436 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:18 crc kubenswrapper[4830]: I1203 22:09:18.826132 4830 status_manager.go:851] 
"Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:18 crc kubenswrapper[4830]: I1203 22:09:18.827122 4830 status_manager.go:851] "Failed to get status for pod" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" Dec 03 22:09:18 crc kubenswrapper[4830]: E1203 22:09:18.827450 4830 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.217:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:19 crc kubenswrapper[4830]: I1203 22:09:19.836088 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bc49caed22edaf18838784c2064d7d7a07e521b0b0cafed7113f90edd7e59ac6"} Dec 03 22:09:20 crc kubenswrapper[4830]: I1203 22:09:20.844753 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"96322843ec1e787b06ef4c2569ce9195b90a9218b828d62ce76db9112ce04944"} Dec 03 22:09:20 crc kubenswrapper[4830]: I1203 22:09:20.845106 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c00401e427ab5dcbcfb0367c9634101c79e4983342cde493498f9510ee6751b1"} Dec 03 22:09:21 crc kubenswrapper[4830]: I1203 22:09:21.854280 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"268d2104f52a0df3caf3132e36134922a6b43748d299506b7ae423329e0c1431"} Dec 03 22:09:21 crc kubenswrapper[4830]: I1203 22:09:21.854684 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fe3a5d246212c5e4acd9b600ad3556f7a3967b5df3565d6d57a91ef83a6146d7"} Dec 03 22:09:21 crc kubenswrapper[4830]: I1203 22:09:21.854709 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:21 crc kubenswrapper[4830]: I1203 22:09:21.854537 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:21 crc kubenswrapper[4830]: I1203 22:09:21.854731 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:22 crc kubenswrapper[4830]: I1203 22:09:22.364101 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:22 crc kubenswrapper[4830]: I1203 22:09:22.364354 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:22 crc kubenswrapper[4830]: I1203 22:09:22.368636 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:22 crc kubenswrapper[4830]: I1203 22:09:22.617185 
4830 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 03 22:09:22 crc kubenswrapper[4830]: I1203 22:09:22.617247 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 03 22:09:22 crc kubenswrapper[4830]: I1203 22:09:22.866296 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 22:09:22 crc kubenswrapper[4830]: I1203 22:09:22.866413 4830 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735" exitCode=1 Dec 03 22:09:22 crc kubenswrapper[4830]: I1203 22:09:22.866469 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735"} Dec 03 22:09:22 crc kubenswrapper[4830]: I1203 22:09:22.867380 4830 scope.go:117] "RemoveContainer" containerID="e656c6d3183bfdd755312e630a2754923e6da3189824f7361ea0f557df734735" Dec 03 22:09:23 crc kubenswrapper[4830]: I1203 22:09:23.873183 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 22:09:23 crc 
kubenswrapper[4830]: I1203 22:09:23.873469 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e2abd1529c1df7e886791db874814356b26e0e2cdb15b879e0fd869269f3d116"} Dec 03 22:09:25 crc kubenswrapper[4830]: I1203 22:09:25.477800 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.430249 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" podUID="ef1fa67c-db0a-4077-92ed-1b55beebf7c6" containerName="oauth-openshift" containerID="cri-o://abec3c7aab5040ef27c81bba8b6b8cb63594e64d72337a86748fa59dc16e7c9d" gracePeriod=15 Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.863567 4830 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.889901 4830 generic.go:334] "Generic (PLEG): container finished" podID="ef1fa67c-db0a-4077-92ed-1b55beebf7c6" containerID="abec3c7aab5040ef27c81bba8b6b8cb63594e64d72337a86748fa59dc16e7c9d" exitCode=0 Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.890007 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" event={"ID":"ef1fa67c-db0a-4077-92ed-1b55beebf7c6","Type":"ContainerDied","Data":"abec3c7aab5040ef27c81bba8b6b8cb63594e64d72337a86748fa59dc16e7c9d"} Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.890070 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" event={"ID":"ef1fa67c-db0a-4077-92ed-1b55beebf7c6","Type":"ContainerDied","Data":"9d11fb4a92fc2f1328d7834697956457e1eb1bc113839f693fa2311909608ed5"} Dec 03 
22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.890092 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d11fb4a92fc2f1328d7834697956457e1eb1bc113839f693fa2311909608ed5" Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.890701 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.890734 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.897243 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.905868 4830 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="044c0b71-df5b-4b2c-8530-71b921787557" Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.942957 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.999668 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-login\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.999724 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-error\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.999760 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvkbc\" (UniqueName: \"kubernetes.io/projected/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-kube-api-access-wvkbc\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.999780 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-trusted-ca-bundle\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.999799 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-serving-cert\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: 
\"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.999819 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-session\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.999838 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-idp-0-file-data\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:26 crc kubenswrapper[4830]: I1203 22:09:26.999854 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-service-ca\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:26.999881 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-router-certs\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:26.999897 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-dir\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:26.999921 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-provider-selection\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:26.999945 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-cliconfig\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:26.999962 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-policies\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:26.999980 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-ocp-branding-template\") pod \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\" (UID: \"ef1fa67c-db0a-4077-92ed-1b55beebf7c6\") " Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.000167 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.000886 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.001633 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.003101 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.005039 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.007404 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.009466 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.009803 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-kube-api-access-wvkbc" (OuterVolumeSpecName: "kube-api-access-wvkbc") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "kube-api-access-wvkbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.010850 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.011281 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.011622 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.011867 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.012405 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.013705 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ef1fa67c-db0a-4077-92ed-1b55beebf7c6" (UID: "ef1fa67c-db0a-4077-92ed-1b55beebf7c6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101140 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101180 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvkbc\" (UniqueName: \"kubernetes.io/projected/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-kube-api-access-wvkbc\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101194 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101208 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101221 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101232 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101244 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101258 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101272 4830 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101285 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101297 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101309 4830 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101321 4830 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.101333 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef1fa67c-db0a-4077-92ed-1b55beebf7c6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.753241 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.756714 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.894978 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lz46c" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.895732 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:27 crc kubenswrapper[4830]: I1203 22:09:27.895761 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:31 crc kubenswrapper[4830]: I1203 22:09:31.365154 4830 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="044c0b71-df5b-4b2c-8530-71b921787557" Dec 03 22:09:33 crc kubenswrapper[4830]: I1203 22:09:33.462696 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 22:09:34 crc kubenswrapper[4830]: I1203 22:09:34.773601 4830 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 22:09:35 crc kubenswrapper[4830]: I1203 22:09:35.488792 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 22:09:36 crc kubenswrapper[4830]: I1203 22:09:36.321905 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 22:09:36 crc kubenswrapper[4830]: I1203 22:09:36.639759 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 22:09:36 crc kubenswrapper[4830]: I1203 22:09:36.952068 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 22:09:37 crc kubenswrapper[4830]: I1203 22:09:37.205674 4830 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 22:09:37 crc kubenswrapper[4830]: I1203 22:09:37.654397 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 22:09:37 crc kubenswrapper[4830]: I1203 22:09:37.748579 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 22:09:38 crc kubenswrapper[4830]: I1203 22:09:38.639424 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 22:09:38 crc kubenswrapper[4830]: I1203 22:09:38.890861 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 22:09:39 crc kubenswrapper[4830]: I1203 22:09:39.069044 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 22:09:39 crc kubenswrapper[4830]: I1203 22:09:39.248265 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 22:09:39 crc kubenswrapper[4830]: I1203 22:09:39.420589 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 22:09:39 crc kubenswrapper[4830]: I1203 22:09:39.472000 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 22:09:39 crc kubenswrapper[4830]: I1203 22:09:39.707754 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 22:09:39 crc kubenswrapper[4830]: I1203 22:09:39.954394 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"openshift-service-ca.crt" Dec 03 22:09:40 crc kubenswrapper[4830]: I1203 22:09:40.077197 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 22:09:40 crc kubenswrapper[4830]: I1203 22:09:40.277595 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 22:09:40 crc kubenswrapper[4830]: I1203 22:09:40.290858 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 22:09:40 crc kubenswrapper[4830]: I1203 22:09:40.305646 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 22:09:40 crc kubenswrapper[4830]: I1203 22:09:40.388820 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 22:09:40 crc kubenswrapper[4830]: I1203 22:09:40.489372 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 22:09:40 crc kubenswrapper[4830]: I1203 22:09:40.643425 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 22:09:40 crc kubenswrapper[4830]: I1203 22:09:40.804549 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 22:09:40 crc kubenswrapper[4830]: I1203 22:09:40.900099 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.006274 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.017736 4830 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.107819 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.118544 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.123851 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.145582 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.443540 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.455123 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.544641 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.567794 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.712746 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.798228 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.806102 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.817007 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.910491 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 22:09:41 crc kubenswrapper[4830]: I1203 22:09:41.975957 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 22:09:42 crc kubenswrapper[4830]: I1203 22:09:42.004615 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 22:09:42 crc kubenswrapper[4830]: I1203 22:09:42.109747 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 22:09:42 crc kubenswrapper[4830]: I1203 22:09:42.279008 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 22:09:42 crc kubenswrapper[4830]: I1203 22:09:42.439088 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 22:09:42 crc kubenswrapper[4830]: I1203 22:09:42.482798 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 22:09:42 crc kubenswrapper[4830]: I1203 22:09:42.606032 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 22:09:42 crc kubenswrapper[4830]: I1203 22:09:42.640871 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 22:09:42 crc kubenswrapper[4830]: I1203 22:09:42.670137 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 22:09:42 crc kubenswrapper[4830]: I1203 22:09:42.720229 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 22:09:42 crc kubenswrapper[4830]: I1203 22:09:42.879647 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.035017 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.166817 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.250822 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.396726 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.412492 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.473687 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 
22:09:43.559728 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.702843 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.710424 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.760200 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.830840 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.869855 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 22:09:43 crc kubenswrapper[4830]: I1203 22:09:43.945823 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 22:09:44 crc kubenswrapper[4830]: I1203 22:09:44.009874 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 22:09:44 crc kubenswrapper[4830]: I1203 22:09:44.112141 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 22:09:44 crc kubenswrapper[4830]: I1203 22:09:44.124008 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 22:09:44 crc kubenswrapper[4830]: I1203 22:09:44.215019 4830 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 22:09:44 crc kubenswrapper[4830]: I1203 22:09:44.337587 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 22:09:44 crc kubenswrapper[4830]: I1203 22:09:44.691097 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 22:09:44 crc kubenswrapper[4830]: I1203 22:09:44.728366 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 22:09:44 crc kubenswrapper[4830]: I1203 22:09:44.732454 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 22:09:44 crc kubenswrapper[4830]: I1203 22:09:44.770933 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 22:09:44 crc kubenswrapper[4830]: I1203 22:09:44.802186 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.008987 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.060748 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.103615 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.154890 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.157017 
4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.231228 4830 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.234099 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=38.234079061 podStartE2EDuration="38.234079061s" podCreationTimestamp="2025-12-03 22:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:09:26.382017849 +0000 UTC m=+255.378479198" watchObservedRunningTime="2025-12-03 22:09:45.234079061 +0000 UTC m=+274.230540420" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.236883 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-lz46c"] Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.236944 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-6b9699fff8-7sdmc"] Dec 03 22:09:45 crc kubenswrapper[4830]: E1203 22:09:45.237182 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" containerName="installer" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.237211 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" containerName="installer" Dec 03 22:09:45 crc kubenswrapper[4830]: E1203 22:09:45.237229 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1fa67c-db0a-4077-92ed-1b55beebf7c6" containerName="oauth-openshift" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.237240 4830 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ef1fa67c-db0a-4077-92ed-1b55beebf7c6" containerName="oauth-openshift" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.237444 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.237473 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="91f7641c-69eb-4471-b294-ed60f8362d7d" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.237630 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1fa67c-db0a-4077-92ed-1b55beebf7c6" containerName="oauth-openshift" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.237664 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85574f9-5751-45ae-a62f-5c8cce45e669" containerName="installer" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.238276 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.239666 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.243622 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.243768 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.244607 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.244819 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.245102 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.245566 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.245734 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.245825 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.245739 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.246066 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.246282 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.246733 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.249825 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.259831 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.266334 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.269562 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.285616 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.297381 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.29735436 podStartE2EDuration="19.29735436s" podCreationTimestamp="2025-12-03 22:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
22:09:45.290587527 +0000 UTC m=+274.287048876" watchObservedRunningTime="2025-12-03 22:09:45.29735436 +0000 UTC m=+274.293815749" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.337404 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.337482 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db446bae-ca46-4013-bb10-57126d9efd67-audit-dir\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.337542 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.337568 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: 
I1203 22:09:45.337604 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qltw\" (UniqueName: \"kubernetes.io/projected/db446bae-ca46-4013-bb10-57126d9efd67-kube-api-access-5qltw\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.337723 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.337784 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.337821 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-audit-policies\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.337863 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.337897 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.337966 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.338000 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.338043 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: 
\"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.338195 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.344819 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1fa67c-db0a-4077-92ed-1b55beebf7c6" path="/var/lib/kubelet/pods/ef1fa67c-db0a-4077-92ed-1b55beebf7c6/volumes" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.347702 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.410007 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.441377 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.441499 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-template-provider-selection\") 
pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.441614 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.441693 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db446bae-ca46-4013-bb10-57126d9efd67-audit-dir\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.441969 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.442036 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.442095 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5qltw\" (UniqueName: \"kubernetes.io/projected/db446bae-ca46-4013-bb10-57126d9efd67-kube-api-access-5qltw\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.442155 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.442195 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.442247 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-audit-policies\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.442298 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: 
\"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.442347 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.442407 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.442448 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.446449 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db446bae-ca46-4013-bb10-57126d9efd67-audit-dir\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.446814 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.446826 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-audit-policies\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.448062 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.449761 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.452590 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " 
pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.453287 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.454718 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.454762 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.455077 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.455838 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.459279 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.460980 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db446bae-ca46-4013-bb10-57126d9efd67-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.466249 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qltw\" (UniqueName: \"kubernetes.io/projected/db446bae-ca46-4013-bb10-57126d9efd67-kube-api-access-5qltw\") pod \"oauth-openshift-6b9699fff8-7sdmc\" (UID: \"db446bae-ca46-4013-bb10-57126d9efd67\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.547229 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.564687 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.592215 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.618272 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.674282 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.682648 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.684707 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.686802 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.694119 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.717664 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.788075 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.923077 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 22:09:45 crc kubenswrapper[4830]: I1203 22:09:45.989288 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.029258 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.037144 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.064069 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.226799 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.297784 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.314660 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.383350 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.383749 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.529603 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.594669 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 22:09:46 crc 
kubenswrapper[4830]: I1203 22:09:46.637381 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.669493 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.764699 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.797659 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.842642 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.877823 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.887610 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 22:09:46 crc kubenswrapper[4830]: I1203 22:09:46.981903 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.094012 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.102732 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.111623 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 
22:09:47.121766 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.134717 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.190262 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.231570 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.248669 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.283690 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.400627 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.415187 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.490238 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.624120 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.688476 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.704626 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.783007 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.811827 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.862617 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 22:09:47 crc kubenswrapper[4830]: I1203 22:09:47.916374 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.013232 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.021141 4830 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.050309 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.057267 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.064323 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.073075 4830 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.107309 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.159685 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.248728 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.318088 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.340659 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.353249 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.371053 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.376074 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.382663 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.490403 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 22:09:48 crc 
kubenswrapper[4830]: I1203 22:09:48.513270 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.525460 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.586673 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.605042 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.610902 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.639613 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 22:09:48 crc kubenswrapper[4830]: E1203 22:09:48.666105 4830 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 22:09:48 crc kubenswrapper[4830]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6b9699fff8-7sdmc_openshift-authentication_db446bae-ca46-4013-bb10-57126d9efd67_0(efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695): error adding pod openshift-authentication_oauth-openshift-6b9699fff8-7sdmc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695" Netns:"/var/run/netns/5186ebb3-04be-44f1-a7da-b0f07681e6ae" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6b9699fff8-7sdmc;K8S_POD_INFRA_CONTAINER_ID=efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695;K8S_POD_UID=db446bae-ca46-4013-bb10-57126d9efd67" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6b9699fff8-7sdmc] networking: Multus: [openshift-authentication/oauth-openshift-6b9699fff8-7sdmc/db446bae-ca46-4013-bb10-57126d9efd67]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6b9699fff8-7sdmc in out of cluster comm: pod "oauth-openshift-6b9699fff8-7sdmc" not found Dec 03 22:09:48 crc kubenswrapper[4830]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 22:09:48 crc kubenswrapper[4830]: > Dec 03 22:09:48 crc kubenswrapper[4830]: E1203 22:09:48.666158 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 22:09:48 crc kubenswrapper[4830]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6b9699fff8-7sdmc_openshift-authentication_db446bae-ca46-4013-bb10-57126d9efd67_0(efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695): error adding pod openshift-authentication_oauth-openshift-6b9699fff8-7sdmc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695" Netns:"/var/run/netns/5186ebb3-04be-44f1-a7da-b0f07681e6ae" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6b9699fff8-7sdmc;K8S_POD_INFRA_CONTAINER_ID=efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695;K8S_POD_UID=db446bae-ca46-4013-bb10-57126d9efd67" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6b9699fff8-7sdmc] networking: Multus: [openshift-authentication/oauth-openshift-6b9699fff8-7sdmc/db446bae-ca46-4013-bb10-57126d9efd67]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6b9699fff8-7sdmc in out of cluster comm: pod "oauth-openshift-6b9699fff8-7sdmc" not found Dec 03 22:09:48 crc kubenswrapper[4830]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 22:09:48 crc kubenswrapper[4830]: > pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:48 crc kubenswrapper[4830]: E1203 22:09:48.666179 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 22:09:48 crc kubenswrapper[4830]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6b9699fff8-7sdmc_openshift-authentication_db446bae-ca46-4013-bb10-57126d9efd67_0(efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695): error adding pod openshift-authentication_oauth-openshift-6b9699fff8-7sdmc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695" 
Netns:"/var/run/netns/5186ebb3-04be-44f1-a7da-b0f07681e6ae" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6b9699fff8-7sdmc;K8S_POD_INFRA_CONTAINER_ID=efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695;K8S_POD_UID=db446bae-ca46-4013-bb10-57126d9efd67" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6b9699fff8-7sdmc] networking: Multus: [openshift-authentication/oauth-openshift-6b9699fff8-7sdmc/db446bae-ca46-4013-bb10-57126d9efd67]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6b9699fff8-7sdmc in out of cluster comm: pod "oauth-openshift-6b9699fff8-7sdmc" not found Dec 03 22:09:48 crc kubenswrapper[4830]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 22:09:48 crc kubenswrapper[4830]: > pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:48 crc kubenswrapper[4830]: E1203 22:09:48.666230 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-6b9699fff8-7sdmc_openshift-authentication(db446bae-ca46-4013-bb10-57126d9efd67)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-6b9699fff8-7sdmc_openshift-authentication(db446bae-ca46-4013-bb10-57126d9efd67)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6b9699fff8-7sdmc_openshift-authentication_db446bae-ca46-4013-bb10-57126d9efd67_0(efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695): error adding pod 
openshift-authentication_oauth-openshift-6b9699fff8-7sdmc to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695\\\" Netns:\\\"/var/run/netns/5186ebb3-04be-44f1-a7da-b0f07681e6ae\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6b9699fff8-7sdmc;K8S_POD_INFRA_CONTAINER_ID=efad57fa35b98bc268962a690a4346f043d9f4d5b9181894e7c07877cf6de695;K8S_POD_UID=db446bae-ca46-4013-bb10-57126d9efd67\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6b9699fff8-7sdmc] networking: Multus: [openshift-authentication/oauth-openshift-6b9699fff8-7sdmc/db446bae-ca46-4013-bb10-57126d9efd67]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6b9699fff8-7sdmc in out of cluster comm: pod \\\"oauth-openshift-6b9699fff8-7sdmc\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" podUID="db446bae-ca46-4013-bb10-57126d9efd67" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.666587 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.692590 4830 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.790601 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.837222 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.864788 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.868035 4830 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.868230 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ba8c3b6676b7d9b0a35863469fdf6d9894e269f42a34b6ac89533936345ab8df" gracePeriod=5 Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.922305 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.926605 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.947575 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.957067 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 22:09:48 crc kubenswrapper[4830]: I1203 22:09:48.991079 4830 reflector.go:368] Caches populated for *v1.Node from 
k8s.io/client-go/informers/factory.go:160 Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.058154 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.263309 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.264041 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.268267 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.281865 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.316054 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.436493 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.507734 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.524087 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.533048 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 22:09:49 crc 
kubenswrapper[4830]: I1203 22:09:49.606147 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.678077 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.679472 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.701764 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.751431 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.890299 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 22:09:49 crc kubenswrapper[4830]: I1203 22:09:49.891700 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.033802 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.163732 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.422990 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.433797 4830 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.572240 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.607665 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.637116 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.651247 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.712400 4830 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.776822 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.790849 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.848373 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.887539 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 22:09:50.943112 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 22:09:50 crc kubenswrapper[4830]: I1203 
22:09:50.997482 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.104154 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.149204 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.287979 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.372193 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.401423 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.442850 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.499298 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.517079 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.719730 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.750767 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" 
Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.755334 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 22:09:51 crc kubenswrapper[4830]: I1203 22:09:51.891776 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.080670 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.087703 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.197349 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.393357 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.393656 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.477479 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.526730 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.613336 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.733270 4830 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.733996 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.815004 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 22:09:52 crc kubenswrapper[4830]: I1203 22:09:52.905063 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 22:09:53 crc kubenswrapper[4830]: I1203 22:09:53.280126 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 22:09:53 crc kubenswrapper[4830]: I1203 22:09:53.293698 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 22:09:53 crc kubenswrapper[4830]: I1203 22:09:53.311214 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 22:09:53 crc kubenswrapper[4830]: I1203 22:09:53.336882 4830 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 22:09:53 crc kubenswrapper[4830]: I1203 22:09:53.409633 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 22:09:53 crc kubenswrapper[4830]: I1203 22:09:53.731635 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.045132 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 22:09:54 
crc kubenswrapper[4830]: I1203 22:09:54.045413 4830 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ba8c3b6676b7d9b0a35863469fdf6d9894e269f42a34b6ac89533936345ab8df" exitCode=137 Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.193328 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.398131 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.428718 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.443777 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.443845 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.457700 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.457826 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.458367 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.458573 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.459402 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.459551 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.459680 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.460108 4830 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.460145 4830 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.460202 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.460245 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.463473 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.474167 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.561764 4830 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.561830 4830 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.561858 4830 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.579711 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.602849 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.712203 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.738194 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 22:09:54 crc kubenswrapper[4830]: I1203 22:09:54.813840 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 22:09:55 crc kubenswrapper[4830]: I1203 22:09:55.052109 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 22:09:55 crc kubenswrapper[4830]: I1203 22:09:55.053736 4830 scope.go:117] "RemoveContainer" containerID="ba8c3b6676b7d9b0a35863469fdf6d9894e269f42a34b6ac89533936345ab8df" Dec 03 22:09:55 crc kubenswrapper[4830]: I1203 22:09:55.053856 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 22:09:55 crc kubenswrapper[4830]: I1203 22:09:55.266772 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 22:09:55 crc kubenswrapper[4830]: I1203 22:09:55.346004 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 03 22:09:55 crc kubenswrapper[4830]: I1203 22:09:55.346250 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 03 22:09:55 crc kubenswrapper[4830]: I1203 22:09:55.372578 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 22:09:55 crc kubenswrapper[4830]: I1203 22:09:55.372636 4830 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="f0b9d8c2-1b00-497c-b084-1cf053af4d96" Dec 03 22:09:55 crc kubenswrapper[4830]: I1203 22:09:55.376691 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 22:09:55 crc kubenswrapper[4830]: I1203 22:09:55.376745 4830 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
mirrorPodUID="f0b9d8c2-1b00-497c-b084-1cf053af4d96" Dec 03 22:09:59 crc kubenswrapper[4830]: I1203 22:09:59.336618 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:59 crc kubenswrapper[4830]: I1203 22:09:59.337321 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:09:59 crc kubenswrapper[4830]: I1203 22:09:59.631006 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b9699fff8-7sdmc"] Dec 03 22:10:00 crc kubenswrapper[4830]: I1203 22:10:00.098535 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" event={"ID":"db446bae-ca46-4013-bb10-57126d9efd67","Type":"ContainerStarted","Data":"1476398ae7194a01f2b674e037e9f1c75d11427d2b054732864999829ce7c13f"} Dec 03 22:10:00 crc kubenswrapper[4830]: I1203 22:10:00.098598 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" event={"ID":"db446bae-ca46-4013-bb10-57126d9efd67","Type":"ContainerStarted","Data":"c51669ab35925fd71dcd58e651055bbce3dc8a3bc888e6e4e7aad7a0d97f8ade"} Dec 03 22:10:00 crc kubenswrapper[4830]: I1203 22:10:00.098973 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:10:00 crc kubenswrapper[4830]: I1203 22:10:00.141366 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" podStartSLOduration=59.141336646 podStartE2EDuration="59.141336646s" podCreationTimestamp="2025-12-03 22:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:10:00.132757524 +0000 UTC 
m=+289.129218903" watchObservedRunningTime="2025-12-03 22:10:00.141336646 +0000 UTC m=+289.137798035" Dec 03 22:10:00 crc kubenswrapper[4830]: I1203 22:10:00.590967 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6b9699fff8-7sdmc" Dec 03 22:10:11 crc kubenswrapper[4830]: I1203 22:10:11.173475 4830 generic.go:334] "Generic (PLEG): container finished" podID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" containerID="fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9" exitCode=0 Dec 03 22:10:11 crc kubenswrapper[4830]: I1203 22:10:11.173595 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" event={"ID":"d4274708-7133-40ca-a10d-e3d2c5fba4cf","Type":"ContainerDied","Data":"fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9"} Dec 03 22:10:11 crc kubenswrapper[4830]: I1203 22:10:11.175413 4830 scope.go:117] "RemoveContainer" containerID="fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9" Dec 03 22:10:12 crc kubenswrapper[4830]: I1203 22:10:12.181586 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" event={"ID":"d4274708-7133-40ca-a10d-e3d2c5fba4cf","Type":"ContainerStarted","Data":"4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d"} Dec 03 22:10:12 crc kubenswrapper[4830]: I1203 22:10:12.183328 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:10:12 crc kubenswrapper[4830]: I1203 22:10:12.228427 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.167080 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv6mk"] Dec 03 
22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.167663 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" podUID="6be04ea3-029e-4aab-b86c-211ef277f024" containerName="controller-manager" containerID="cri-o://d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400" gracePeriod=30 Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.263285 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm"] Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.263804 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" podUID="aeb240dc-cbe3-4b23-b806-4296015a46f0" containerName="route-controller-manager" containerID="cri-o://e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29" gracePeriod=30 Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.641648 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.647085 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.810125 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeb240dc-cbe3-4b23-b806-4296015a46f0-serving-cert\") pod \"aeb240dc-cbe3-4b23-b806-4296015a46f0\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.810175 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-config\") pod \"6be04ea3-029e-4aab-b86c-211ef277f024\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.810198 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-client-ca\") pod \"6be04ea3-029e-4aab-b86c-211ef277f024\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.810221 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-config\") pod \"aeb240dc-cbe3-4b23-b806-4296015a46f0\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.810239 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-proxy-ca-bundles\") pod \"6be04ea3-029e-4aab-b86c-211ef277f024\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.810277 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6be04ea3-029e-4aab-b86c-211ef277f024-serving-cert\") pod \"6be04ea3-029e-4aab-b86c-211ef277f024\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.810299 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zgps\" (UniqueName: \"kubernetes.io/projected/aeb240dc-cbe3-4b23-b806-4296015a46f0-kube-api-access-5zgps\") pod \"aeb240dc-cbe3-4b23-b806-4296015a46f0\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.810335 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-client-ca\") pod \"aeb240dc-cbe3-4b23-b806-4296015a46f0\" (UID: \"aeb240dc-cbe3-4b23-b806-4296015a46f0\") " Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.810355 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrzs4\" (UniqueName: \"kubernetes.io/projected/6be04ea3-029e-4aab-b86c-211ef277f024-kube-api-access-xrzs4\") pod \"6be04ea3-029e-4aab-b86c-211ef277f024\" (UID: \"6be04ea3-029e-4aab-b86c-211ef277f024\") " Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.811106 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-client-ca" (OuterVolumeSpecName: "client-ca") pod "6be04ea3-029e-4aab-b86c-211ef277f024" (UID: "6be04ea3-029e-4aab-b86c-211ef277f024"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.811467 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-config" (OuterVolumeSpecName: "config") pod "6be04ea3-029e-4aab-b86c-211ef277f024" (UID: "6be04ea3-029e-4aab-b86c-211ef277f024"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.812088 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "aeb240dc-cbe3-4b23-b806-4296015a46f0" (UID: "aeb240dc-cbe3-4b23-b806-4296015a46f0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.812757 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-config" (OuterVolumeSpecName: "config") pod "aeb240dc-cbe3-4b23-b806-4296015a46f0" (UID: "aeb240dc-cbe3-4b23-b806-4296015a46f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.812949 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6be04ea3-029e-4aab-b86c-211ef277f024" (UID: "6be04ea3-029e-4aab-b86c-211ef277f024"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.816490 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb240dc-cbe3-4b23-b806-4296015a46f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aeb240dc-cbe3-4b23-b806-4296015a46f0" (UID: "aeb240dc-cbe3-4b23-b806-4296015a46f0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.816683 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb240dc-cbe3-4b23-b806-4296015a46f0-kube-api-access-5zgps" (OuterVolumeSpecName: "kube-api-access-5zgps") pod "aeb240dc-cbe3-4b23-b806-4296015a46f0" (UID: "aeb240dc-cbe3-4b23-b806-4296015a46f0"). InnerVolumeSpecName "kube-api-access-5zgps". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.816944 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be04ea3-029e-4aab-b86c-211ef277f024-kube-api-access-xrzs4" (OuterVolumeSpecName: "kube-api-access-xrzs4") pod "6be04ea3-029e-4aab-b86c-211ef277f024" (UID: "6be04ea3-029e-4aab-b86c-211ef277f024"). InnerVolumeSpecName "kube-api-access-xrzs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.817471 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6be04ea3-029e-4aab-b86c-211ef277f024-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6be04ea3-029e-4aab-b86c-211ef277f024" (UID: "6be04ea3-029e-4aab-b86c-211ef277f024"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.911625 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be04ea3-029e-4aab-b86c-211ef277f024-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.911850 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zgps\" (UniqueName: \"kubernetes.io/projected/aeb240dc-cbe3-4b23-b806-4296015a46f0-kube-api-access-5zgps\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.911860 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.911869 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrzs4\" (UniqueName: \"kubernetes.io/projected/6be04ea3-029e-4aab-b86c-211ef277f024-kube-api-access-xrzs4\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.911878 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeb240dc-cbe3-4b23-b806-4296015a46f0-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.911888 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.911897 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.911904 4830 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeb240dc-cbe3-4b23-b806-4296015a46f0-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:18 crc kubenswrapper[4830]: I1203 22:10:18.911914 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6be04ea3-029e-4aab-b86c-211ef277f024-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.227952 4830 generic.go:334] "Generic (PLEG): container finished" podID="aeb240dc-cbe3-4b23-b806-4296015a46f0" containerID="e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29" exitCode=0 Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.228009 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" event={"ID":"aeb240dc-cbe3-4b23-b806-4296015a46f0","Type":"ContainerDied","Data":"e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29"} Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.228035 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" event={"ID":"aeb240dc-cbe3-4b23-b806-4296015a46f0","Type":"ContainerDied","Data":"03ff5b4b6cc3e425faeb631733e94d0cdec582f6ace46b2c5194e2fa8917bd90"} Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.228050 4830 scope.go:117] "RemoveContainer" containerID="e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.228143 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.232844 4830 generic.go:334] "Generic (PLEG): container finished" podID="6be04ea3-029e-4aab-b86c-211ef277f024" containerID="d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400" exitCode=0 Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.232868 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" event={"ID":"6be04ea3-029e-4aab-b86c-211ef277f024","Type":"ContainerDied","Data":"d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400"} Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.232881 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" event={"ID":"6be04ea3-029e-4aab-b86c-211ef277f024","Type":"ContainerDied","Data":"a5a6b713571dd3719105a9ee423caf12f528f674714f4eb1ea4712bef4080b2a"} Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.232915 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lv6mk" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.244690 4830 scope.go:117] "RemoveContainer" containerID="e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29" Dec 03 22:10:19 crc kubenswrapper[4830]: E1203 22:10:19.246026 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29\": container with ID starting with e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29 not found: ID does not exist" containerID="e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.246089 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29"} err="failed to get container status \"e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29\": rpc error: code = NotFound desc = could not find container \"e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29\": container with ID starting with e7c5c847983b6a246ab863bdf4634e1a8167969d8242cde8d23bad31d83cfe29 not found: ID does not exist" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.246111 4830 scope.go:117] "RemoveContainer" containerID="d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.268169 4830 scope.go:117] "RemoveContainer" containerID="d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400" Dec 03 22:10:19 crc kubenswrapper[4830]: E1203 22:10:19.268543 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400\": container with ID starting with 
d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400 not found: ID does not exist" containerID="d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.268570 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400"} err="failed to get container status \"d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400\": rpc error: code = NotFound desc = could not find container \"d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400\": container with ID starting with d98eddd36a0e55c63d9a0843d213b848e1d8b2989d7f5cb9b5e601a557073400 not found: ID does not exist" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.272752 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm"] Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.274496 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xtxmm"] Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.282269 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv6mk"] Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.285569 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lv6mk"] Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.319693 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz"] Dec 03 22:10:19 crc kubenswrapper[4830]: E1203 22:10:19.319971 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb240dc-cbe3-4b23-b806-4296015a46f0" containerName="route-controller-manager" Dec 03 22:10:19 crc 
kubenswrapper[4830]: I1203 22:10:19.319989 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb240dc-cbe3-4b23-b806-4296015a46f0" containerName="route-controller-manager" Dec 03 22:10:19 crc kubenswrapper[4830]: E1203 22:10:19.320003 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.320012 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 22:10:19 crc kubenswrapper[4830]: E1203 22:10:19.320026 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be04ea3-029e-4aab-b86c-211ef277f024" containerName="controller-manager" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.320032 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be04ea3-029e-4aab-b86c-211ef277f024" containerName="controller-manager" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.320123 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6be04ea3-029e-4aab-b86c-211ef277f024" containerName="controller-manager" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.320132 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb240dc-cbe3-4b23-b806-4296015a46f0" containerName="route-controller-manager" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.320143 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.320791 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.322609 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh"] Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.322893 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.322980 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.322984 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.323210 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.323412 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.366428 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.366825 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.366905 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.367038 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.367076 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.367129 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.367248 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.367372 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.369387 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.399778 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be04ea3-029e-4aab-b86c-211ef277f024" path="/var/lib/kubelet/pods/6be04ea3-029e-4aab-b86c-211ef277f024/volumes" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.400291 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb240dc-cbe3-4b23-b806-4296015a46f0" path="/var/lib/kubelet/pods/aeb240dc-cbe3-4b23-b806-4296015a46f0/volumes" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.400686 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh"] Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.400714 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz"] Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.417673 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-proxy-ca-bundles\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.417734 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99c6921e-5468-4501-a93b-d4116e2e4f3b-serving-cert\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.417817 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgv87\" (UniqueName: \"kubernetes.io/projected/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-kube-api-access-rgv87\") pod \"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.417886 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2lvb\" (UniqueName: \"kubernetes.io/projected/99c6921e-5468-4501-a93b-d4116e2e4f3b-kube-api-access-m2lvb\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.417939 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-serving-cert\") pod 
\"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.417990 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-client-ca\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.418037 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-config\") pod \"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.418063 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-client-ca\") pod \"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.418084 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-config\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.519615 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99c6921e-5468-4501-a93b-d4116e2e4f3b-serving-cert\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.519711 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgv87\" (UniqueName: \"kubernetes.io/projected/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-kube-api-access-rgv87\") pod \"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.519742 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2lvb\" (UniqueName: \"kubernetes.io/projected/99c6921e-5468-4501-a93b-d4116e2e4f3b-kube-api-access-m2lvb\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.519788 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-serving-cert\") pod \"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.519875 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-client-ca\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: 
\"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.519937 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-config\") pod \"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.519959 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-client-ca\") pod \"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.519982 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-config\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.520070 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-proxy-ca-bundles\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.522593 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-config\") pod \"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.523270 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-client-ca\") pod \"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.523527 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-client-ca\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.524141 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-config\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.525985 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99c6921e-5468-4501-a93b-d4116e2e4f3b-serving-cert\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.528268 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-serving-cert\") pod \"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.540417 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-proxy-ca-bundles\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.543063 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgv87\" (UniqueName: \"kubernetes.io/projected/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-kube-api-access-rgv87\") pod \"route-controller-manager-56cb849f86-j7tdz\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.544474 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2lvb\" (UniqueName: \"kubernetes.io/projected/99c6921e-5468-4501-a93b-d4116e2e4f3b-kube-api-access-m2lvb\") pod \"controller-manager-9d76ff8cd-8xfgh\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.702750 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:19 crc kubenswrapper[4830]: I1203 22:10:19.713737 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:20 crc kubenswrapper[4830]: I1203 22:10:20.006976 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh"] Dec 03 22:10:20 crc kubenswrapper[4830]: I1203 22:10:20.153008 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz"] Dec 03 22:10:20 crc kubenswrapper[4830]: W1203 22:10:20.154928 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac74d2f2_eaea_46ea_b2aa_e4d897af856e.slice/crio-a36e27331284e65288eeca0dcc2bf9e8cd1cd8ed2df8cf86ae50f54646abb01c WatchSource:0}: Error finding container a36e27331284e65288eeca0dcc2bf9e8cd1cd8ed2df8cf86ae50f54646abb01c: Status 404 returned error can't find the container with id a36e27331284e65288eeca0dcc2bf9e8cd1cd8ed2df8cf86ae50f54646abb01c Dec 03 22:10:20 crc kubenswrapper[4830]: I1203 22:10:20.248170 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" event={"ID":"ac74d2f2-eaea-46ea-b2aa-e4d897af856e","Type":"ContainerStarted","Data":"a36e27331284e65288eeca0dcc2bf9e8cd1cd8ed2df8cf86ae50f54646abb01c"} Dec 03 22:10:20 crc kubenswrapper[4830]: I1203 22:10:20.250032 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" event={"ID":"99c6921e-5468-4501-a93b-d4116e2e4f3b","Type":"ContainerStarted","Data":"4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603"} Dec 03 22:10:20 crc kubenswrapper[4830]: I1203 22:10:20.250082 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" 
event={"ID":"99c6921e-5468-4501-a93b-d4116e2e4f3b","Type":"ContainerStarted","Data":"7e1ba8aab68ec8ebc1cf2d2001654c90bdf507cc32eb06762d5069f4e3ea8f82"} Dec 03 22:10:20 crc kubenswrapper[4830]: I1203 22:10:20.250877 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:20 crc kubenswrapper[4830]: I1203 22:10:20.257317 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:20 crc kubenswrapper[4830]: I1203 22:10:20.270926 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" podStartSLOduration=2.270911367 podStartE2EDuration="2.270911367s" podCreationTimestamp="2025-12-03 22:10:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:10:20.267612108 +0000 UTC m=+309.264073497" watchObservedRunningTime="2025-12-03 22:10:20.270911367 +0000 UTC m=+309.267372716" Dec 03 22:10:21 crc kubenswrapper[4830]: I1203 22:10:21.257927 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" event={"ID":"ac74d2f2-eaea-46ea-b2aa-e4d897af856e","Type":"ContainerStarted","Data":"49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d"} Dec 03 22:10:21 crc kubenswrapper[4830]: I1203 22:10:21.280245 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" podStartSLOduration=3.28021893 podStartE2EDuration="3.28021893s" podCreationTimestamp="2025-12-03 22:10:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:10:21.276810597 
+0000 UTC m=+310.273271956" watchObservedRunningTime="2025-12-03 22:10:21.28021893 +0000 UTC m=+310.276680319" Dec 03 22:10:22 crc kubenswrapper[4830]: I1203 22:10:22.262755 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:22 crc kubenswrapper[4830]: I1203 22:10:22.269752 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.257963 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh"] Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.265257 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz"] Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.267148 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" podUID="99c6921e-5468-4501-a93b-d4116e2e4f3b" containerName="controller-manager" containerID="cri-o://4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603" gracePeriod=30 Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.701581 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.774239 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-proxy-ca-bundles\") pod \"99c6921e-5468-4501-a93b-d4116e2e4f3b\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.774305 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99c6921e-5468-4501-a93b-d4116e2e4f3b-serving-cert\") pod \"99c6921e-5468-4501-a93b-d4116e2e4f3b\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.774338 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-client-ca\") pod \"99c6921e-5468-4501-a93b-d4116e2e4f3b\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.774387 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2lvb\" (UniqueName: \"kubernetes.io/projected/99c6921e-5468-4501-a93b-d4116e2e4f3b-kube-api-access-m2lvb\") pod \"99c6921e-5468-4501-a93b-d4116e2e4f3b\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.774415 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-config\") pod \"99c6921e-5468-4501-a93b-d4116e2e4f3b\" (UID: \"99c6921e-5468-4501-a93b-d4116e2e4f3b\") " Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.775402 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-client-ca" (OuterVolumeSpecName: "client-ca") pod "99c6921e-5468-4501-a93b-d4116e2e4f3b" (UID: "99c6921e-5468-4501-a93b-d4116e2e4f3b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.775552 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-config" (OuterVolumeSpecName: "config") pod "99c6921e-5468-4501-a93b-d4116e2e4f3b" (UID: "99c6921e-5468-4501-a93b-d4116e2e4f3b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.775596 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "99c6921e-5468-4501-a93b-d4116e2e4f3b" (UID: "99c6921e-5468-4501-a93b-d4116e2e4f3b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.779658 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c6921e-5468-4501-a93b-d4116e2e4f3b-kube-api-access-m2lvb" (OuterVolumeSpecName: "kube-api-access-m2lvb") pod "99c6921e-5468-4501-a93b-d4116e2e4f3b" (UID: "99c6921e-5468-4501-a93b-d4116e2e4f3b"). InnerVolumeSpecName "kube-api-access-m2lvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.780163 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c6921e-5468-4501-a93b-d4116e2e4f3b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "99c6921e-5468-4501-a93b-d4116e2e4f3b" (UID: "99c6921e-5468-4501-a93b-d4116e2e4f3b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.875783 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.875835 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99c6921e-5468-4501-a93b-d4116e2e4f3b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.875854 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.875873 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2lvb\" (UniqueName: \"kubernetes.io/projected/99c6921e-5468-4501-a93b-d4116e2e4f3b-kube-api-access-m2lvb\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:23 crc kubenswrapper[4830]: I1203 22:10:23.875893 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c6921e-5468-4501-a93b-d4116e2e4f3b-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.276564 4830 generic.go:334] "Generic (PLEG): container finished" podID="99c6921e-5468-4501-a93b-d4116e2e4f3b" containerID="4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603" exitCode=0 Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.276659 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.276695 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" event={"ID":"99c6921e-5468-4501-a93b-d4116e2e4f3b","Type":"ContainerDied","Data":"4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603"} Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.276782 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh" event={"ID":"99c6921e-5468-4501-a93b-d4116e2e4f3b","Type":"ContainerDied","Data":"7e1ba8aab68ec8ebc1cf2d2001654c90bdf507cc32eb06762d5069f4e3ea8f82"} Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.276818 4830 scope.go:117] "RemoveContainer" containerID="4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.277426 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" podUID="ac74d2f2-eaea-46ea-b2aa-e4d897af856e" containerName="route-controller-manager" containerID="cri-o://49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d" gracePeriod=30 Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.308406 4830 scope.go:117] "RemoveContainer" containerID="4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603" Dec 03 22:10:24 crc kubenswrapper[4830]: E1203 22:10:24.309702 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603\": container with ID starting with 4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603 not found: ID does not exist" 
containerID="4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.309751 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603"} err="failed to get container status \"4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603\": rpc error: code = NotFound desc = could not find container \"4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603\": container with ID starting with 4f99f8da5a971833448d6abfc1559153ed8e870dfc74f0902fa3794c3ba83603 not found: ID does not exist" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.333273 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh"] Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.339961 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9d76ff8cd-8xfgh"] Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.735225 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.789452 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-config" (OuterVolumeSpecName: "config") pod "ac74d2f2-eaea-46ea-b2aa-e4d897af856e" (UID: "ac74d2f2-eaea-46ea-b2aa-e4d897af856e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.789607 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-config\") pod \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.789864 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-serving-cert\") pod \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.789921 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgv87\" (UniqueName: \"kubernetes.io/projected/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-kube-api-access-rgv87\") pod \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.789980 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-client-ca\") pod \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\" (UID: \"ac74d2f2-eaea-46ea-b2aa-e4d897af856e\") " Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.790293 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.790790 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-client-ca" (OuterVolumeSpecName: "client-ca") pod "ac74d2f2-eaea-46ea-b2aa-e4d897af856e" 
(UID: "ac74d2f2-eaea-46ea-b2aa-e4d897af856e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.798685 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-kube-api-access-rgv87" (OuterVolumeSpecName: "kube-api-access-rgv87") pod "ac74d2f2-eaea-46ea-b2aa-e4d897af856e" (UID: "ac74d2f2-eaea-46ea-b2aa-e4d897af856e"). InnerVolumeSpecName "kube-api-access-rgv87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.800727 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ac74d2f2-eaea-46ea-b2aa-e4d897af856e" (UID: "ac74d2f2-eaea-46ea-b2aa-e4d897af856e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.891733 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.891791 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgv87\" (UniqueName: \"kubernetes.io/projected/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-kube-api-access-rgv87\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:24 crc kubenswrapper[4830]: I1203 22:10:24.891813 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac74d2f2-eaea-46ea-b2aa-e4d897af856e-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.287024 4830 generic.go:334] "Generic (PLEG): container finished" podID="ac74d2f2-eaea-46ea-b2aa-e4d897af856e" 
containerID="49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d" exitCode=0 Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.287161 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.287212 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" event={"ID":"ac74d2f2-eaea-46ea-b2aa-e4d897af856e","Type":"ContainerDied","Data":"49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d"} Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.287313 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz" event={"ID":"ac74d2f2-eaea-46ea-b2aa-e4d897af856e","Type":"ContainerDied","Data":"a36e27331284e65288eeca0dcc2bf9e8cd1cd8ed2df8cf86ae50f54646abb01c"} Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.287355 4830 scope.go:117] "RemoveContainer" containerID="49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.326100 4830 scope.go:117] "RemoveContainer" containerID="49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d" Dec 03 22:10:25 crc kubenswrapper[4830]: E1203 22:10:25.327843 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d\": container with ID starting with 49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d not found: ID does not exist" containerID="49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.327903 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d"} err="failed to get container status \"49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d\": rpc error: code = NotFound desc = could not find container \"49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d\": container with ID starting with 49da561b820a020433a781742e1d7dc78fd572acbe605a1e12a2c55d5be4381d not found: ID does not exist" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.331658 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9"] Dec 03 22:10:25 crc kubenswrapper[4830]: E1203 22:10:25.332096 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac74d2f2-eaea-46ea-b2aa-e4d897af856e" containerName="route-controller-manager" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.332126 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac74d2f2-eaea-46ea-b2aa-e4d897af856e" containerName="route-controller-manager" Dec 03 22:10:25 crc kubenswrapper[4830]: E1203 22:10:25.332151 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c6921e-5468-4501-a93b-d4116e2e4f3b" containerName="controller-manager" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.332165 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c6921e-5468-4501-a93b-d4116e2e4f3b" containerName="controller-manager" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.332341 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac74d2f2-eaea-46ea-b2aa-e4d897af856e" containerName="route-controller-manager" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.332366 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c6921e-5468-4501-a93b-d4116e2e4f3b" containerName="controller-manager" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.333085 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.335751 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.335935 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7"] Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.336628 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.336836 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.337699 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.338763 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.339027 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.345864 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.349250 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.349383 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 
22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.349383 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.351943 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.352252 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.352343 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.363552 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c6921e-5468-4501-a93b-d4116e2e4f3b" path="/var/lib/kubelet/pods/99c6921e-5468-4501-a93b-d4116e2e4f3b/volumes" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.364781 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7"] Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.377763 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.384754 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9"] Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.399161 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb008f98-5977-4a3d-aecc-0e3604f2e92b-serving-cert\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " 
pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.399218 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-proxy-ca-bundles\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.399260 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-serving-cert\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.399433 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2n2l\" (UniqueName: \"kubernetes.io/projected/eb008f98-5977-4a3d-aecc-0e3604f2e92b-kube-api-access-j2n2l\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.399489 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-client-ca\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.399574 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-client-ca\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.399613 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-config\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.399639 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-config\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.399783 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pwtm\" (UniqueName: \"kubernetes.io/projected/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-kube-api-access-2pwtm\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.415984 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz"] Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.423154 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-56cb849f86-j7tdz"] Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.500583 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb008f98-5977-4a3d-aecc-0e3604f2e92b-serving-cert\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.500666 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-proxy-ca-bundles\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.500732 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-serving-cert\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.500766 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2n2l\" (UniqueName: \"kubernetes.io/projected/eb008f98-5977-4a3d-aecc-0e3604f2e92b-kube-api-access-j2n2l\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.500792 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-client-ca\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.500822 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-client-ca\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.500849 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-config\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.500880 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-config\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.500925 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pwtm\" (UniqueName: \"kubernetes.io/projected/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-kube-api-access-2pwtm\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: 
I1203 22:10:25.502391 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-client-ca\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.502645 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-config\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.503130 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-proxy-ca-bundles\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.503314 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-config\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.504138 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-client-ca\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " 
pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.506712 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-serving-cert\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.506970 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb008f98-5977-4a3d-aecc-0e3604f2e92b-serving-cert\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.519141 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pwtm\" (UniqueName: \"kubernetes.io/projected/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-kube-api-access-2pwtm\") pod \"route-controller-manager-7869657cf5-8sbn7\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.527425 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2n2l\" (UniqueName: \"kubernetes.io/projected/eb008f98-5977-4a3d-aecc-0e3604f2e92b-kube-api-access-j2n2l\") pod \"controller-manager-5fcb6fc779-nl2s9\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.686109 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.705831 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.909209 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9"] Dec 03 22:10:25 crc kubenswrapper[4830]: I1203 22:10:25.983886 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7"] Dec 03 22:10:25 crc kubenswrapper[4830]: W1203 22:10:25.994601 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a0e70e3_0dbc_4e5f_a6ac_a2335e5f58d5.slice/crio-735f3836ddf6273c42a5438f23e0efa46d2594b2c3bf8e0054a625c09047b147 WatchSource:0}: Error finding container 735f3836ddf6273c42a5438f23e0efa46d2594b2c3bf8e0054a625c09047b147: Status 404 returned error can't find the container with id 735f3836ddf6273c42a5438f23e0efa46d2594b2c3bf8e0054a625c09047b147 Dec 03 22:10:26 crc kubenswrapper[4830]: I1203 22:10:26.296490 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" event={"ID":"eb008f98-5977-4a3d-aecc-0e3604f2e92b","Type":"ContainerStarted","Data":"6b2d8ef1d2664923db247a16f0f65404523ae7de6da6c6683fd7b8663289ce4b"} Dec 03 22:10:26 crc kubenswrapper[4830]: I1203 22:10:26.296808 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" event={"ID":"eb008f98-5977-4a3d-aecc-0e3604f2e92b","Type":"ContainerStarted","Data":"c5eaf3145c74290498306616565ea8d73a9d9fe707434f6e52b3b8959277a84e"} Dec 03 22:10:26 crc kubenswrapper[4830]: I1203 22:10:26.297311 4830 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:26 crc kubenswrapper[4830]: I1203 22:10:26.298251 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" event={"ID":"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5","Type":"ContainerStarted","Data":"f93122a9e837094d91ff7b076574205291644524740ac1392bb6d08dd671b1f1"} Dec 03 22:10:26 crc kubenswrapper[4830]: I1203 22:10:26.298273 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" event={"ID":"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5","Type":"ContainerStarted","Data":"735f3836ddf6273c42a5438f23e0efa46d2594b2c3bf8e0054a625c09047b147"} Dec 03 22:10:26 crc kubenswrapper[4830]: I1203 22:10:26.298434 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:26 crc kubenswrapper[4830]: I1203 22:10:26.306253 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:26 crc kubenswrapper[4830]: I1203 22:10:26.324641 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" podStartSLOduration=3.324623891 podStartE2EDuration="3.324623891s" podCreationTimestamp="2025-12-03 22:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:10:26.321575799 +0000 UTC m=+315.318037138" watchObservedRunningTime="2025-12-03 22:10:26.324623891 +0000 UTC m=+315.321085240" Dec 03 22:10:26 crc kubenswrapper[4830]: I1203 22:10:26.343331 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" podStartSLOduration=3.343311096 podStartE2EDuration="3.343311096s" podCreationTimestamp="2025-12-03 22:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:10:26.342148965 +0000 UTC m=+315.338610314" watchObservedRunningTime="2025-12-03 22:10:26.343311096 +0000 UTC m=+315.339772445" Dec 03 22:10:26 crc kubenswrapper[4830]: I1203 22:10:26.643034 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:27 crc kubenswrapper[4830]: I1203 22:10:27.348379 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac74d2f2-eaea-46ea-b2aa-e4d897af856e" path="/var/lib/kubelet/pods/ac74d2f2-eaea-46ea-b2aa-e4d897af856e/volumes" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.171154 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9"] Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.171900 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" podUID="eb008f98-5977-4a3d-aecc-0e3604f2e92b" containerName="controller-manager" containerID="cri-o://6b2d8ef1d2664923db247a16f0f65404523ae7de6da6c6683fd7b8663289ce4b" gracePeriod=30 Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.185128 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7"] Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.185402 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" podUID="1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5" 
containerName="route-controller-manager" containerID="cri-o://f93122a9e837094d91ff7b076574205291644524740ac1392bb6d08dd671b1f1" gracePeriod=30 Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.380740 4830 generic.go:334] "Generic (PLEG): container finished" podID="eb008f98-5977-4a3d-aecc-0e3604f2e92b" containerID="6b2d8ef1d2664923db247a16f0f65404523ae7de6da6c6683fd7b8663289ce4b" exitCode=0 Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.380841 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" event={"ID":"eb008f98-5977-4a3d-aecc-0e3604f2e92b","Type":"ContainerDied","Data":"6b2d8ef1d2664923db247a16f0f65404523ae7de6da6c6683fd7b8663289ce4b"} Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.383487 4830 generic.go:334] "Generic (PLEG): container finished" podID="1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5" containerID="f93122a9e837094d91ff7b076574205291644524740ac1392bb6d08dd671b1f1" exitCode=0 Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.383552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" event={"ID":"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5","Type":"ContainerDied","Data":"f93122a9e837094d91ff7b076574205291644524740ac1392bb6d08dd671b1f1"} Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.715118 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.793839 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-serving-cert\") pod \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.793925 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pwtm\" (UniqueName: \"kubernetes.io/projected/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-kube-api-access-2pwtm\") pod \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.794017 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-config\") pod \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.794072 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-client-ca\") pod \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\" (UID: \"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5\") " Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.795035 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-config" (OuterVolumeSpecName: "config") pod "1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5" (UID: "1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.795183 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5" (UID: "1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.801330 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5" (UID: "1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.801617 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-kube-api-access-2pwtm" (OuterVolumeSpecName: "kube-api-access-2pwtm") pod "1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5" (UID: "1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5"). InnerVolumeSpecName "kube-api-access-2pwtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.852568 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.895367 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.895402 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pwtm\" (UniqueName: \"kubernetes.io/projected/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-kube-api-access-2pwtm\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.895415 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.895423 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.996063 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-config\") pod \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.996196 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb008f98-5977-4a3d-aecc-0e3604f2e92b-serving-cert\") pod \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.996275 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-client-ca\") pod \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.996318 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-proxy-ca-bundles\") pod \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.996453 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2n2l\" (UniqueName: \"kubernetes.io/projected/eb008f98-5977-4a3d-aecc-0e3604f2e92b-kube-api-access-j2n2l\") pod \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\" (UID: \"eb008f98-5977-4a3d-aecc-0e3604f2e92b\") " Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.997192 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-config" (OuterVolumeSpecName: "config") pod "eb008f98-5977-4a3d-aecc-0e3604f2e92b" (UID: "eb008f98-5977-4a3d-aecc-0e3604f2e92b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.997228 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "eb008f98-5977-4a3d-aecc-0e3604f2e92b" (UID: "eb008f98-5977-4a3d-aecc-0e3604f2e92b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.997342 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-client-ca" (OuterVolumeSpecName: "client-ca") pod "eb008f98-5977-4a3d-aecc-0e3604f2e92b" (UID: "eb008f98-5977-4a3d-aecc-0e3604f2e92b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:10:38 crc kubenswrapper[4830]: I1203 22:10:38.999688 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb008f98-5977-4a3d-aecc-0e3604f2e92b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eb008f98-5977-4a3d-aecc-0e3604f2e92b" (UID: "eb008f98-5977-4a3d-aecc-0e3604f2e92b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.000820 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb008f98-5977-4a3d-aecc-0e3604f2e92b-kube-api-access-j2n2l" (OuterVolumeSpecName: "kube-api-access-j2n2l") pod "eb008f98-5977-4a3d-aecc-0e3604f2e92b" (UID: "eb008f98-5977-4a3d-aecc-0e3604f2e92b"). InnerVolumeSpecName "kube-api-access-j2n2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.097895 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2n2l\" (UniqueName: \"kubernetes.io/projected/eb008f98-5977-4a3d-aecc-0e3604f2e92b-kube-api-access-j2n2l\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.097969 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.097999 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb008f98-5977-4a3d-aecc-0e3604f2e92b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.098025 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.098052 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb008f98-5977-4a3d-aecc-0e3604f2e92b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.357883 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q"] Dec 03 22:10:39 crc kubenswrapper[4830]: E1203 22:10:39.358049 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5" containerName="route-controller-manager" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.358060 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5" containerName="route-controller-manager" Dec 03 22:10:39 
crc kubenswrapper[4830]: E1203 22:10:39.358083 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb008f98-5977-4a3d-aecc-0e3604f2e92b" containerName="controller-manager" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.358089 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb008f98-5977-4a3d-aecc-0e3604f2e92b" containerName="controller-manager" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.358176 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb008f98-5977-4a3d-aecc-0e3604f2e92b" containerName="controller-manager" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.358190 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5" containerName="route-controller-manager" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.358470 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn"] Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.358915 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.360656 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.364525 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q"] Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.368180 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn"] Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.397313 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" event={"ID":"1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5","Type":"ContainerDied","Data":"735f3836ddf6273c42a5438f23e0efa46d2594b2c3bf8e0054a625c09047b147"} Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.397365 4830 scope.go:117] "RemoveContainer" containerID="f93122a9e837094d91ff7b076574205291644524740ac1392bb6d08dd671b1f1" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.397541 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.401038 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" event={"ID":"eb008f98-5977-4a3d-aecc-0e3604f2e92b","Type":"ContainerDied","Data":"c5eaf3145c74290498306616565ea8d73a9d9fe707434f6e52b3b8959277a84e"} Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.401111 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.415379 4830 scope.go:117] "RemoveContainer" containerID="6b2d8ef1d2664923db247a16f0f65404523ae7de6da6c6683fd7b8663289ce4b" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.429525 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7"] Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.435230 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7869657cf5-8sbn7"] Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.438858 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9"] Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.442207 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5fcb6fc779-nl2s9"] Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.502224 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cfecbde-fc1d-4ec6-b345-39e869081a16-serving-cert\") pod \"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.502276 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfecbde-fc1d-4ec6-b345-39e869081a16-config\") pod \"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: 
I1203 22:10:39.502315 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4cfecbde-fc1d-4ec6-b345-39e869081a16-client-ca\") pod \"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.502335 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84d86810-a62c-4155-9029-8d4c9f8be962-client-ca\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.502358 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84d86810-a62c-4155-9029-8d4c9f8be962-proxy-ca-bundles\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.502374 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhr8s\" (UniqueName: \"kubernetes.io/projected/84d86810-a62c-4155-9029-8d4c9f8be962-kube-api-access-nhr8s\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.502394 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d86810-a62c-4155-9029-8d4c9f8be962-config\") 
pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.502427 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg8jm\" (UniqueName: \"kubernetes.io/projected/4cfecbde-fc1d-4ec6-b345-39e869081a16-kube-api-access-cg8jm\") pod \"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.502452 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84d86810-a62c-4155-9029-8d4c9f8be962-serving-cert\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.603719 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4cfecbde-fc1d-4ec6-b345-39e869081a16-client-ca\") pod \"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.604085 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84d86810-a62c-4155-9029-8d4c9f8be962-client-ca\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.604223 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84d86810-a62c-4155-9029-8d4c9f8be962-proxy-ca-bundles\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.604317 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhr8s\" (UniqueName: \"kubernetes.io/projected/84d86810-a62c-4155-9029-8d4c9f8be962-kube-api-access-nhr8s\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.604409 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d86810-a62c-4155-9029-8d4c9f8be962-config\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.604558 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg8jm\" (UniqueName: \"kubernetes.io/projected/4cfecbde-fc1d-4ec6-b345-39e869081a16-kube-api-access-cg8jm\") pod \"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.604700 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84d86810-a62c-4155-9029-8d4c9f8be962-serving-cert\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " 
pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.604835 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cfecbde-fc1d-4ec6-b345-39e869081a16-serving-cert\") pod \"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.604968 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfecbde-fc1d-4ec6-b345-39e869081a16-config\") pod \"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.606026 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84d86810-a62c-4155-9029-8d4c9f8be962-client-ca\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.606277 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/84d86810-a62c-4155-9029-8d4c9f8be962-proxy-ca-bundles\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.606533 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfecbde-fc1d-4ec6-b345-39e869081a16-config\") pod 
\"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.606500 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4cfecbde-fc1d-4ec6-b345-39e869081a16-client-ca\") pod \"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.607932 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d86810-a62c-4155-9029-8d4c9f8be962-config\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.616624 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84d86810-a62c-4155-9029-8d4c9f8be962-serving-cert\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.616844 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cfecbde-fc1d-4ec6-b345-39e869081a16-serving-cert\") pod \"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.629603 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg8jm\" 
(UniqueName: \"kubernetes.io/projected/4cfecbde-fc1d-4ec6-b345-39e869081a16-kube-api-access-cg8jm\") pod \"route-controller-manager-7b77444494-thmdn\" (UID: \"4cfecbde-fc1d-4ec6-b345-39e869081a16\") " pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.632908 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhr8s\" (UniqueName: \"kubernetes.io/projected/84d86810-a62c-4155-9029-8d4c9f8be962-kube-api-access-nhr8s\") pod \"controller-manager-7c68f9dccd-qdz4q\" (UID: \"84d86810-a62c-4155-9029-8d4c9f8be962\") " pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.679890 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.698390 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.879930 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn"] Dec 03 22:10:39 crc kubenswrapper[4830]: I1203 22:10:39.929294 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q"] Dec 03 22:10:39 crc kubenswrapper[4830]: W1203 22:10:39.958779 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d86810_a62c_4155_9029_8d4c9f8be962.slice/crio-2c5789b0430c63d3f194edac882b095afb6c236abdcf3a40ca579c0a0b0d7325 WatchSource:0}: Error finding container 2c5789b0430c63d3f194edac882b095afb6c236abdcf3a40ca579c0a0b0d7325: Status 404 returned error can't find the container with id 2c5789b0430c63d3f194edac882b095afb6c236abdcf3a40ca579c0a0b0d7325 Dec 03 22:10:40 crc kubenswrapper[4830]: I1203 22:10:40.406635 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" event={"ID":"4cfecbde-fc1d-4ec6-b345-39e869081a16","Type":"ContainerStarted","Data":"8fdef7e8f17dcae76333e0f052a43351790357155b0b035cf959c22b72f4e75c"} Dec 03 22:10:40 crc kubenswrapper[4830]: I1203 22:10:40.406873 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:40 crc kubenswrapper[4830]: I1203 22:10:40.406885 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" event={"ID":"4cfecbde-fc1d-4ec6-b345-39e869081a16","Type":"ContainerStarted","Data":"c9d0c977a73c33ac7dc775c3b961b4c6779cab131efbdbf261e32ef18c868862"} Dec 03 22:10:40 crc kubenswrapper[4830]: I1203 
22:10:40.409161 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" event={"ID":"84d86810-a62c-4155-9029-8d4c9f8be962","Type":"ContainerStarted","Data":"39528c974ebdabaf7933d1f882895f29b6dd02a79cdbb91736af2f103e9a2e05"} Dec 03 22:10:40 crc kubenswrapper[4830]: I1203 22:10:40.409196 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" event={"ID":"84d86810-a62c-4155-9029-8d4c9f8be962","Type":"ContainerStarted","Data":"2c5789b0430c63d3f194edac882b095afb6c236abdcf3a40ca579c0a0b0d7325"} Dec 03 22:10:40 crc kubenswrapper[4830]: I1203 22:10:40.409384 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:40 crc kubenswrapper[4830]: I1203 22:10:40.426726 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" Dec 03 22:10:40 crc kubenswrapper[4830]: I1203 22:10:40.451435 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" podStartSLOduration=2.451418872 podStartE2EDuration="2.451418872s" podCreationTimestamp="2025-12-03 22:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:10:40.43630023 +0000 UTC m=+329.432761579" watchObservedRunningTime="2025-12-03 22:10:40.451418872 +0000 UTC m=+329.447880221" Dec 03 22:10:40 crc kubenswrapper[4830]: I1203 22:10:40.452246 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c68f9dccd-qdz4q" podStartSLOduration=2.452237584 podStartE2EDuration="2.452237584s" podCreationTimestamp="2025-12-03 22:10:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:10:40.451272828 +0000 UTC m=+329.447734187" watchObservedRunningTime="2025-12-03 22:10:40.452237584 +0000 UTC m=+329.448698933" Dec 03 22:10:40 crc kubenswrapper[4830]: I1203 22:10:40.502620 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b77444494-thmdn" Dec 03 22:10:41 crc kubenswrapper[4830]: I1203 22:10:41.343841 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5" path="/var/lib/kubelet/pods/1a0e70e3-0dbc-4e5f-a6ac-a2335e5f58d5/volumes" Dec 03 22:10:41 crc kubenswrapper[4830]: I1203 22:10:41.344338 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb008f98-5977-4a3d-aecc-0e3604f2e92b" path="/var/lib/kubelet/pods/eb008f98-5977-4a3d-aecc-0e3604f2e92b/volumes" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.247984 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mcjzb"] Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.248846 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mcjzb" podUID="39a2a67d-4e6e-4514-9304-966057dd71bf" containerName="registry-server" containerID="cri-o://9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854" gracePeriod=30 Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.268901 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gxd8v"] Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.269436 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gxd8v" podUID="d80631b4-3fa5-491b-b330-80f733c3b0a4" containerName="registry-server" 
containerID="cri-o://a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816" gracePeriod=30 Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.281725 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jbhf7"] Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.281971 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" podUID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" containerName="marketplace-operator" containerID="cri-o://4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d" gracePeriod=30 Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.285376 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j7zzm"] Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.285590 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j7zzm" podUID="876ff782-f899-41ad-801d-52d31854b34c" containerName="registry-server" containerID="cri-o://c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c" gracePeriod=30 Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.293825 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nvnwt"] Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.294169 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nvnwt" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerName="registry-server" containerID="cri-o://1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db" gracePeriod=30 Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.298027 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7s8ph"] Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.298671 
4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.301924 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7s8ph"] Dec 03 22:11:18 crc kubenswrapper[4830]: E1203 22:11:18.473678 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c is running failed: container process not found" containerID="c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 22:11:18 crc kubenswrapper[4830]: E1203 22:11:18.474867 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c is running failed: container process not found" containerID="c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 22:11:18 crc kubenswrapper[4830]: E1203 22:11:18.475083 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c is running failed: container process not found" containerID="c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 22:11:18 crc kubenswrapper[4830]: E1203 22:11:18.475117 4830 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c is running failed: container process not found" 
probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-j7zzm" podUID="876ff782-f899-41ad-801d-52d31854b34c" containerName="registry-server" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.487874 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/96069a0e-4ce1-4f68-835c-0a0110f36b2c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7s8ph\" (UID: \"96069a0e-4ce1-4f68-835c-0a0110f36b2c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.487978 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96069a0e-4ce1-4f68-835c-0a0110f36b2c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7s8ph\" (UID: \"96069a0e-4ce1-4f68-835c-0a0110f36b2c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.488483 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrbxp\" (UniqueName: \"kubernetes.io/projected/96069a0e-4ce1-4f68-835c-0a0110f36b2c-kube-api-access-mrbxp\") pod \"marketplace-operator-79b997595-7s8ph\" (UID: \"96069a0e-4ce1-4f68-835c-0a0110f36b2c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.589875 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/96069a0e-4ce1-4f68-835c-0a0110f36b2c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7s8ph\" (UID: \"96069a0e-4ce1-4f68-835c-0a0110f36b2c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:18 crc kubenswrapper[4830]: 
I1203 22:11:18.589946 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96069a0e-4ce1-4f68-835c-0a0110f36b2c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7s8ph\" (UID: \"96069a0e-4ce1-4f68-835c-0a0110f36b2c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.589967 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrbxp\" (UniqueName: \"kubernetes.io/projected/96069a0e-4ce1-4f68-835c-0a0110f36b2c-kube-api-access-mrbxp\") pod \"marketplace-operator-79b997595-7s8ph\" (UID: \"96069a0e-4ce1-4f68-835c-0a0110f36b2c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.591671 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96069a0e-4ce1-4f68-835c-0a0110f36b2c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7s8ph\" (UID: \"96069a0e-4ce1-4f68-835c-0a0110f36b2c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.601928 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/96069a0e-4ce1-4f68-835c-0a0110f36b2c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7s8ph\" (UID: \"96069a0e-4ce1-4f68-835c-0a0110f36b2c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.608416 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrbxp\" (UniqueName: \"kubernetes.io/projected/96069a0e-4ce1-4f68-835c-0a0110f36b2c-kube-api-access-mrbxp\") pod \"marketplace-operator-79b997595-7s8ph\" (UID: 
\"96069a0e-4ce1-4f68-835c-0a0110f36b2c\") " pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.644978 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.749320 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.796572 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-utilities\") pod \"39a2a67d-4e6e-4514-9304-966057dd71bf\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.796617 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-catalog-content\") pod \"39a2a67d-4e6e-4514-9304-966057dd71bf\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.796659 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25g85\" (UniqueName: \"kubernetes.io/projected/39a2a67d-4e6e-4514-9304-966057dd71bf-kube-api-access-25g85\") pod \"39a2a67d-4e6e-4514-9304-966057dd71bf\" (UID: \"39a2a67d-4e6e-4514-9304-966057dd71bf\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.797776 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-utilities" (OuterVolumeSpecName: "utilities") pod "39a2a67d-4e6e-4514-9304-966057dd71bf" (UID: "39a2a67d-4e6e-4514-9304-966057dd71bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.800772 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a2a67d-4e6e-4514-9304-966057dd71bf-kube-api-access-25g85" (OuterVolumeSpecName: "kube-api-access-25g85") pod "39a2a67d-4e6e-4514-9304-966057dd71bf" (UID: "39a2a67d-4e6e-4514-9304-966057dd71bf"). InnerVolumeSpecName "kube-api-access-25g85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.803367 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.824433 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.825361 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.826037 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.868990 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39a2a67d-4e6e-4514-9304-966057dd71bf" (UID: "39a2a67d-4e6e-4514-9304-966057dd71bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.886299 4830 generic.go:334] "Generic (PLEG): container finished" podID="d80631b4-3fa5-491b-b330-80f733c3b0a4" containerID="a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816" exitCode=0 Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.886357 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gxd8v" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.886359 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxd8v" event={"ID":"d80631b4-3fa5-491b-b330-80f733c3b0a4","Type":"ContainerDied","Data":"a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816"} Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.886425 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxd8v" event={"ID":"d80631b4-3fa5-491b-b330-80f733c3b0a4","Type":"ContainerDied","Data":"02aa2320e9aa4982d01f85ab026ee788946e7eb5e2b4ff7ceac0164fc9f54599"} Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.886451 4830 scope.go:117] "RemoveContainer" containerID="a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.889338 4830 generic.go:334] "Generic (PLEG): container finished" podID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerID="1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db" exitCode=0 Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.889387 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nvnwt" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.889419 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvnwt" event={"ID":"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24","Type":"ContainerDied","Data":"1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db"} Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.889438 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvnwt" event={"ID":"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24","Type":"ContainerDied","Data":"e2a3c6cf2ce9e7e6944c0644d1bb0a9fdaedebbf93e43b4519482fd78e31b2a5"} Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.891776 4830 generic.go:334] "Generic (PLEG): container finished" podID="39a2a67d-4e6e-4514-9304-966057dd71bf" containerID="9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854" exitCode=0 Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.891836 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mcjzb" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.891831 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcjzb" event={"ID":"39a2a67d-4e6e-4514-9304-966057dd71bf","Type":"ContainerDied","Data":"9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854"} Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.892256 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcjzb" event={"ID":"39a2a67d-4e6e-4514-9304-966057dd71bf","Type":"ContainerDied","Data":"5b123390d61460d16a0dc63a3b037169a64f71efe080f887a7487b7cc53690ff"} Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.895741 4830 generic.go:334] "Generic (PLEG): container finished" podID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" containerID="4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d" exitCode=0 Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.895772 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.895808 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" event={"ID":"d4274708-7133-40ca-a10d-e3d2c5fba4cf","Type":"ContainerDied","Data":"4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d"} Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.895827 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jbhf7" event={"ID":"d4274708-7133-40ca-a10d-e3d2c5fba4cf","Type":"ContainerDied","Data":"39a3b55131bfe8f32af9a43f11b717c42dde81ec31fe34f91871bd17737635d1"} Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.898339 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-utilities\") pod \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.898397 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-utilities" (OuterVolumeSpecName: "utilities") pod "d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" (UID: "d3ffcb90-9016-4c43-8b6c-9452e9cf6e24"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.898442 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-catalog-content\") pod \"d80631b4-3fa5-491b-b330-80f733c3b0a4\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.898541 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-catalog-content\") pod \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.899301 4830 generic.go:334] "Generic (PLEG): container finished" podID="876ff782-f899-41ad-801d-52d31854b34c" containerID="c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c" exitCode=0 Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.899343 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7zzm" event={"ID":"876ff782-f899-41ad-801d-52d31854b34c","Type":"ContainerDied","Data":"c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c"} Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.899371 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7zzm" event={"ID":"876ff782-f899-41ad-801d-52d31854b34c","Type":"ContainerDied","Data":"d9b94388d7cee9611a52d035b1f1f4202f32653dc528d84baa4635b866e027fb"} Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.899387 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j7zzm" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.899617 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5x98\" (UniqueName: \"kubernetes.io/projected/d4274708-7133-40ca-a10d-e3d2c5fba4cf-kube-api-access-g5x98\") pod \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.899667 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-utilities\") pod \"d80631b4-3fa5-491b-b330-80f733c3b0a4\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.899705 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-trusted-ca\") pod \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.899754 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-utilities\") pod \"876ff782-f899-41ad-801d-52d31854b34c\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.899794 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhdqc\" (UniqueName: \"kubernetes.io/projected/d80631b4-3fa5-491b-b330-80f733c3b0a4-kube-api-access-xhdqc\") pod \"d80631b4-3fa5-491b-b330-80f733c3b0a4\" (UID: \"d80631b4-3fa5-491b-b330-80f733c3b0a4\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.899833 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-catalog-content\") pod \"876ff782-f899-41ad-801d-52d31854b34c\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.902002 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-utilities" (OuterVolumeSpecName: "utilities") pod "876ff782-f899-41ad-801d-52d31854b34c" (UID: "876ff782-f899-41ad-801d-52d31854b34c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.903562 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b65s\" (UniqueName: \"kubernetes.io/projected/876ff782-f899-41ad-801d-52d31854b34c-kube-api-access-4b65s\") pod \"876ff782-f899-41ad-801d-52d31854b34c\" (UID: \"876ff782-f899-41ad-801d-52d31854b34c\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.903698 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrlmz\" (UniqueName: \"kubernetes.io/projected/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-kube-api-access-rrlmz\") pod \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\" (UID: \"d3ffcb90-9016-4c43-8b6c-9452e9cf6e24\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.903795 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-operator-metrics\") pod \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\" (UID: \"d4274708-7133-40ca-a10d-e3d2c5fba4cf\") " Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.903831 4830 scope.go:117] "RemoveContainer" containerID="a038e3b53efe8b416898d247cc0e47b7c93f3fee8e11ea19118f64869624c856" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 
22:11:18.905238 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4274708-7133-40ca-a10d-e3d2c5fba4cf-kube-api-access-g5x98" (OuterVolumeSpecName: "kube-api-access-g5x98") pod "d4274708-7133-40ca-a10d-e3d2c5fba4cf" (UID: "d4274708-7133-40ca-a10d-e3d2c5fba4cf"). InnerVolumeSpecName "kube-api-access-g5x98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.906649 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d80631b4-3fa5-491b-b330-80f733c3b0a4-kube-api-access-xhdqc" (OuterVolumeSpecName: "kube-api-access-xhdqc") pod "d80631b4-3fa5-491b-b330-80f733c3b0a4" (UID: "d80631b4-3fa5-491b-b330-80f733c3b0a4"). InnerVolumeSpecName "kube-api-access-xhdqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.907361 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.907456 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.907471 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39a2a67d-4e6e-4514-9304-966057dd71bf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.907535 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25g85\" (UniqueName: \"kubernetes.io/projected/39a2a67d-4e6e-4514-9304-966057dd71bf-kube-api-access-25g85\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 
22:11:18.907683 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.908344 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-utilities" (OuterVolumeSpecName: "utilities") pod "d80631b4-3fa5-491b-b330-80f733c3b0a4" (UID: "d80631b4-3fa5-491b-b330-80f733c3b0a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.910088 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d4274708-7133-40ca-a10d-e3d2c5fba4cf" (UID: "d4274708-7133-40ca-a10d-e3d2c5fba4cf"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.910482 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/876ff782-f899-41ad-801d-52d31854b34c-kube-api-access-4b65s" (OuterVolumeSpecName: "kube-api-access-4b65s") pod "876ff782-f899-41ad-801d-52d31854b34c" (UID: "876ff782-f899-41ad-801d-52d31854b34c"). InnerVolumeSpecName "kube-api-access-4b65s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.912149 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-kube-api-access-rrlmz" (OuterVolumeSpecName: "kube-api-access-rrlmz") pod "d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" (UID: "d3ffcb90-9016-4c43-8b6c-9452e9cf6e24"). InnerVolumeSpecName "kube-api-access-rrlmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.920134 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d4274708-7133-40ca-a10d-e3d2c5fba4cf" (UID: "d4274708-7133-40ca-a10d-e3d2c5fba4cf"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.926685 4830 scope.go:117] "RemoveContainer" containerID="ccde60b69a45f764380653a6601127cbfe68e82601f99300060b036de11b4d2a" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.947226 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "876ff782-f899-41ad-801d-52d31854b34c" (UID: "876ff782-f899-41ad-801d-52d31854b34c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.952669 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mcjzb"] Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.953405 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mcjzb"] Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.953717 4830 scope.go:117] "RemoveContainer" containerID="a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816" Dec 03 22:11:18 crc kubenswrapper[4830]: E1203 22:11:18.954166 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816\": container with ID starting with a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816 not found: ID does not exist" containerID="a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.954212 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816"} err="failed to get container status \"a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816\": rpc error: code = NotFound desc = could not find container \"a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816\": container with ID starting with a1749b85ab3dcb11a53760f5034704cd5bcc998f334e506e186fe5258b617816 not found: ID does not exist" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.954242 4830 scope.go:117] "RemoveContainer" containerID="a038e3b53efe8b416898d247cc0e47b7c93f3fee8e11ea19118f64869624c856" Dec 03 22:11:18 crc kubenswrapper[4830]: E1203 22:11:18.954594 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"a038e3b53efe8b416898d247cc0e47b7c93f3fee8e11ea19118f64869624c856\": container with ID starting with a038e3b53efe8b416898d247cc0e47b7c93f3fee8e11ea19118f64869624c856 not found: ID does not exist" containerID="a038e3b53efe8b416898d247cc0e47b7c93f3fee8e11ea19118f64869624c856" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.954615 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a038e3b53efe8b416898d247cc0e47b7c93f3fee8e11ea19118f64869624c856"} err="failed to get container status \"a038e3b53efe8b416898d247cc0e47b7c93f3fee8e11ea19118f64869624c856\": rpc error: code = NotFound desc = could not find container \"a038e3b53efe8b416898d247cc0e47b7c93f3fee8e11ea19118f64869624c856\": container with ID starting with a038e3b53efe8b416898d247cc0e47b7c93f3fee8e11ea19118f64869624c856 not found: ID does not exist" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.954627 4830 scope.go:117] "RemoveContainer" containerID="ccde60b69a45f764380653a6601127cbfe68e82601f99300060b036de11b4d2a" Dec 03 22:11:18 crc kubenswrapper[4830]: E1203 22:11:18.954848 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccde60b69a45f764380653a6601127cbfe68e82601f99300060b036de11b4d2a\": container with ID starting with ccde60b69a45f764380653a6601127cbfe68e82601f99300060b036de11b4d2a not found: ID does not exist" containerID="ccde60b69a45f764380653a6601127cbfe68e82601f99300060b036de11b4d2a" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.954872 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccde60b69a45f764380653a6601127cbfe68e82601f99300060b036de11b4d2a"} err="failed to get container status \"ccde60b69a45f764380653a6601127cbfe68e82601f99300060b036de11b4d2a\": rpc error: code = NotFound desc = could not find container \"ccde60b69a45f764380653a6601127cbfe68e82601f99300060b036de11b4d2a\": container 
with ID starting with ccde60b69a45f764380653a6601127cbfe68e82601f99300060b036de11b4d2a not found: ID does not exist" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.954884 4830 scope.go:117] "RemoveContainer" containerID="1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.973590 4830 scope.go:117] "RemoveContainer" containerID="5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.989313 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d80631b4-3fa5-491b-b330-80f733c3b0a4" (UID: "d80631b4-3fa5-491b-b330-80f733c3b0a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:11:18 crc kubenswrapper[4830]: I1203 22:11:18.999606 4830 scope.go:117] "RemoveContainer" containerID="36919664608f59d2f26565337502d5c1050f3f9db8dac4ecf5ea52dc9d3cd7f1" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.008937 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhdqc\" (UniqueName: \"kubernetes.io/projected/d80631b4-3fa5-491b-b330-80f733c3b0a4-kube-api-access-xhdqc\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.008967 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/876ff782-f899-41ad-801d-52d31854b34c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.008979 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b65s\" (UniqueName: \"kubernetes.io/projected/876ff782-f899-41ad-801d-52d31854b34c-kube-api-access-4b65s\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.008991 
4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrlmz\" (UniqueName: \"kubernetes.io/projected/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-kube-api-access-rrlmz\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.009018 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.009031 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.009042 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5x98\" (UniqueName: \"kubernetes.io/projected/d4274708-7133-40ca-a10d-e3d2c5fba4cf-kube-api-access-g5x98\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.009053 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d80631b4-3fa5-491b-b330-80f733c3b0a4-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.009066 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4274708-7133-40ca-a10d-e3d2c5fba4cf-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.011944 4830 scope.go:117] "RemoveContainer" containerID="1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db" Dec 03 22:11:19 crc kubenswrapper[4830]: E1203 22:11:19.012365 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db\": container with ID starting with 1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db not found: ID does not exist" containerID="1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.012408 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db"} err="failed to get container status \"1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db\": rpc error: code = NotFound desc = could not find container \"1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db\": container with ID starting with 1dd65756ffeacb0a631873b3796ff3ee83a61866c435069e95e2da02874db2db not found: ID does not exist" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.012437 4830 scope.go:117] "RemoveContainer" containerID="5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97" Dec 03 22:11:19 crc kubenswrapper[4830]: E1203 22:11:19.013071 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97\": container with ID starting with 5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97 not found: ID does not exist" containerID="5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.013113 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97"} err="failed to get container status \"5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97\": rpc error: code = NotFound desc = could not find container \"5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97\": container with ID 
starting with 5e82aec7c70dd98a12046eadc2e57785a8ae21899798ef9779628894a75bca97 not found: ID does not exist" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.013148 4830 scope.go:117] "RemoveContainer" containerID="36919664608f59d2f26565337502d5c1050f3f9db8dac4ecf5ea52dc9d3cd7f1" Dec 03 22:11:19 crc kubenswrapper[4830]: E1203 22:11:19.013411 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36919664608f59d2f26565337502d5c1050f3f9db8dac4ecf5ea52dc9d3cd7f1\": container with ID starting with 36919664608f59d2f26565337502d5c1050f3f9db8dac4ecf5ea52dc9d3cd7f1 not found: ID does not exist" containerID="36919664608f59d2f26565337502d5c1050f3f9db8dac4ecf5ea52dc9d3cd7f1" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.013438 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36919664608f59d2f26565337502d5c1050f3f9db8dac4ecf5ea52dc9d3cd7f1"} err="failed to get container status \"36919664608f59d2f26565337502d5c1050f3f9db8dac4ecf5ea52dc9d3cd7f1\": rpc error: code = NotFound desc = could not find container \"36919664608f59d2f26565337502d5c1050f3f9db8dac4ecf5ea52dc9d3cd7f1\": container with ID starting with 36919664608f59d2f26565337502d5c1050f3f9db8dac4ecf5ea52dc9d3cd7f1 not found: ID does not exist" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.013454 4830 scope.go:117] "RemoveContainer" containerID="9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.025261 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" (UID: "d3ffcb90-9016-4c43-8b6c-9452e9cf6e24"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.026770 4830 scope.go:117] "RemoveContainer" containerID="b4a53776de3f7be43d6112ca962e2c7a9e9c4c7a3e1e0d78109de1803304c426" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.040747 4830 scope.go:117] "RemoveContainer" containerID="3483b185fde367ef6debf2e897dc08caf379f51e7d72f9ad631a878735847ca4" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.053155 4830 scope.go:117] "RemoveContainer" containerID="9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854" Dec 03 22:11:19 crc kubenswrapper[4830]: E1203 22:11:19.053624 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854\": container with ID starting with 9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854 not found: ID does not exist" containerID="9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.053661 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854"} err="failed to get container status \"9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854\": rpc error: code = NotFound desc = could not find container \"9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854\": container with ID starting with 9973d5b16432aca99095a7ea55c3d50808c09d3f8347a830e8c05f7b1b31f854 not found: ID does not exist" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.053700 4830 scope.go:117] "RemoveContainer" containerID="b4a53776de3f7be43d6112ca962e2c7a9e9c4c7a3e1e0d78109de1803304c426" Dec 03 22:11:19 crc kubenswrapper[4830]: E1203 22:11:19.053963 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"b4a53776de3f7be43d6112ca962e2c7a9e9c4c7a3e1e0d78109de1803304c426\": container with ID starting with b4a53776de3f7be43d6112ca962e2c7a9e9c4c7a3e1e0d78109de1803304c426 not found: ID does not exist" containerID="b4a53776de3f7be43d6112ca962e2c7a9e9c4c7a3e1e0d78109de1803304c426" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.053993 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4a53776de3f7be43d6112ca962e2c7a9e9c4c7a3e1e0d78109de1803304c426"} err="failed to get container status \"b4a53776de3f7be43d6112ca962e2c7a9e9c4c7a3e1e0d78109de1803304c426\": rpc error: code = NotFound desc = could not find container \"b4a53776de3f7be43d6112ca962e2c7a9e9c4c7a3e1e0d78109de1803304c426\": container with ID starting with b4a53776de3f7be43d6112ca962e2c7a9e9c4c7a3e1e0d78109de1803304c426 not found: ID does not exist" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.054016 4830 scope.go:117] "RemoveContainer" containerID="3483b185fde367ef6debf2e897dc08caf379f51e7d72f9ad631a878735847ca4" Dec 03 22:11:19 crc kubenswrapper[4830]: E1203 22:11:19.054245 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3483b185fde367ef6debf2e897dc08caf379f51e7d72f9ad631a878735847ca4\": container with ID starting with 3483b185fde367ef6debf2e897dc08caf379f51e7d72f9ad631a878735847ca4 not found: ID does not exist" containerID="3483b185fde367ef6debf2e897dc08caf379f51e7d72f9ad631a878735847ca4" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.054275 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3483b185fde367ef6debf2e897dc08caf379f51e7d72f9ad631a878735847ca4"} err="failed to get container status \"3483b185fde367ef6debf2e897dc08caf379f51e7d72f9ad631a878735847ca4\": rpc error: code = NotFound desc = could not find container \"3483b185fde367ef6debf2e897dc08caf379f51e7d72f9ad631a878735847ca4\": 
container with ID starting with 3483b185fde367ef6debf2e897dc08caf379f51e7d72f9ad631a878735847ca4 not found: ID does not exist" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.054302 4830 scope.go:117] "RemoveContainer" containerID="4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.065520 4830 scope.go:117] "RemoveContainer" containerID="fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.078604 4830 scope.go:117] "RemoveContainer" containerID="4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d" Dec 03 22:11:19 crc kubenswrapper[4830]: E1203 22:11:19.079072 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d\": container with ID starting with 4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d not found: ID does not exist" containerID="4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.079123 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d"} err="failed to get container status \"4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d\": rpc error: code = NotFound desc = could not find container \"4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d\": container with ID starting with 4da68244b126612a7a53a983c9844d88ae20c1d696fd3f9771980cb2d8b12a8d not found: ID does not exist" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.079178 4830 scope.go:117] "RemoveContainer" containerID="fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9" Dec 03 22:11:19 crc kubenswrapper[4830]: E1203 22:11:19.079862 4830 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9\": container with ID starting with fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9 not found: ID does not exist" containerID="fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.079898 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9"} err="failed to get container status \"fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9\": rpc error: code = NotFound desc = could not find container \"fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9\": container with ID starting with fbfeed6b7655e6affea17206ed123607324e9f1465bbe28ea7a651ea066977e9 not found: ID does not exist" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.079921 4830 scope.go:117] "RemoveContainer" containerID="c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.098130 4830 scope.go:117] "RemoveContainer" containerID="901c475b99a3e9d20b92a0c344dc4bae26be276a2e43edca00ee4230867cbddf" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.100624 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7s8ph"] Dec 03 22:11:19 crc kubenswrapper[4830]: W1203 22:11:19.104009 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96069a0e_4ce1_4f68_835c_0a0110f36b2c.slice/crio-d62d039b3f3b659236b4b36c2faa88b5329c6a3ed02207f4647cbc9df2f3f665 WatchSource:0}: Error finding container d62d039b3f3b659236b4b36c2faa88b5329c6a3ed02207f4647cbc9df2f3f665: Status 404 returned error can't find the container with id 
d62d039b3f3b659236b4b36c2faa88b5329c6a3ed02207f4647cbc9df2f3f665 Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.109958 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.116763 4830 scope.go:117] "RemoveContainer" containerID="b49daa807c4cdb5448ee81d8ad5c0c40d1c24b3f36884f4522b9559d6ec3a242" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.132271 4830 scope.go:117] "RemoveContainer" containerID="c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c" Dec 03 22:11:19 crc kubenswrapper[4830]: E1203 22:11:19.132757 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c\": container with ID starting with c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c not found: ID does not exist" containerID="c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.132821 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c"} err="failed to get container status \"c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c\": rpc error: code = NotFound desc = could not find container \"c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c\": container with ID starting with c0b2b767d3f69dc0c14bffcadfa9c759e5930194c63442ca70c81b271c265a6c not found: ID does not exist" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.132864 4830 scope.go:117] "RemoveContainer" containerID="901c475b99a3e9d20b92a0c344dc4bae26be276a2e43edca00ee4230867cbddf" Dec 03 22:11:19 crc kubenswrapper[4830]: E1203 22:11:19.133169 4830 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"901c475b99a3e9d20b92a0c344dc4bae26be276a2e43edca00ee4230867cbddf\": container with ID starting with 901c475b99a3e9d20b92a0c344dc4bae26be276a2e43edca00ee4230867cbddf not found: ID does not exist" containerID="901c475b99a3e9d20b92a0c344dc4bae26be276a2e43edca00ee4230867cbddf" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.133202 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"901c475b99a3e9d20b92a0c344dc4bae26be276a2e43edca00ee4230867cbddf"} err="failed to get container status \"901c475b99a3e9d20b92a0c344dc4bae26be276a2e43edca00ee4230867cbddf\": rpc error: code = NotFound desc = could not find container \"901c475b99a3e9d20b92a0c344dc4bae26be276a2e43edca00ee4230867cbddf\": container with ID starting with 901c475b99a3e9d20b92a0c344dc4bae26be276a2e43edca00ee4230867cbddf not found: ID does not exist" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.133225 4830 scope.go:117] "RemoveContainer" containerID="b49daa807c4cdb5448ee81d8ad5c0c40d1c24b3f36884f4522b9559d6ec3a242" Dec 03 22:11:19 crc kubenswrapper[4830]: E1203 22:11:19.133594 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b49daa807c4cdb5448ee81d8ad5c0c40d1c24b3f36884f4522b9559d6ec3a242\": container with ID starting with b49daa807c4cdb5448ee81d8ad5c0c40d1c24b3f36884f4522b9559d6ec3a242 not found: ID does not exist" containerID="b49daa807c4cdb5448ee81d8ad5c0c40d1c24b3f36884f4522b9559d6ec3a242" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.133638 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49daa807c4cdb5448ee81d8ad5c0c40d1c24b3f36884f4522b9559d6ec3a242"} err="failed to get container status \"b49daa807c4cdb5448ee81d8ad5c0c40d1c24b3f36884f4522b9559d6ec3a242\": rpc error: code = NotFound 
desc = could not find container \"b49daa807c4cdb5448ee81d8ad5c0c40d1c24b3f36884f4522b9559d6ec3a242\": container with ID starting with b49daa807c4cdb5448ee81d8ad5c0c40d1c24b3f36884f4522b9559d6ec3a242 not found: ID does not exist" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.230465 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nvnwt"] Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.236584 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nvnwt"] Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.247566 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gxd8v"] Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.252011 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gxd8v"] Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.261442 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jbhf7"] Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.264153 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jbhf7"] Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.278122 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j7zzm"] Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.283266 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j7zzm"] Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.341967 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a2a67d-4e6e-4514-9304-966057dd71bf" path="/var/lib/kubelet/pods/39a2a67d-4e6e-4514-9304-966057dd71bf/volumes" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.342586 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="876ff782-f899-41ad-801d-52d31854b34c" path="/var/lib/kubelet/pods/876ff782-f899-41ad-801d-52d31854b34c/volumes" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.343117 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" path="/var/lib/kubelet/pods/d3ffcb90-9016-4c43-8b6c-9452e9cf6e24/volumes" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.344171 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" path="/var/lib/kubelet/pods/d4274708-7133-40ca-a10d-e3d2c5fba4cf/volumes" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.344876 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d80631b4-3fa5-491b-b330-80f733c3b0a4" path="/var/lib/kubelet/pods/d80631b4-3fa5-491b-b330-80f733c3b0a4/volumes" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.910708 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" event={"ID":"96069a0e-4ce1-4f68-835c-0a0110f36b2c","Type":"ContainerStarted","Data":"2c66c32881a84e9a3cf397fe3cc2d3f4cfe652ccc943b76fc3e089046ac6889b"} Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.910755 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" event={"ID":"96069a0e-4ce1-4f68-835c-0a0110f36b2c","Type":"ContainerStarted","Data":"d62d039b3f3b659236b4b36c2faa88b5329c6a3ed02207f4647cbc9df2f3f665"} Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.911673 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 22:11:19.914808 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" Dec 03 22:11:19 crc kubenswrapper[4830]: I1203 
22:11:19.933457 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7s8ph" podStartSLOduration=1.933441514 podStartE2EDuration="1.933441514s" podCreationTimestamp="2025-12-03 22:11:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:11:19.932140838 +0000 UTC m=+368.928602197" watchObservedRunningTime="2025-12-03 22:11:19.933441514 +0000 UTC m=+368.929902863" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.035624 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mc5ls"] Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036186 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876ff782-f899-41ad-801d-52d31854b34c" containerName="extract-utilities" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036205 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="876ff782-f899-41ad-801d-52d31854b34c" containerName="extract-utilities" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036218 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" containerName="marketplace-operator" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036226 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" containerName="marketplace-operator" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036240 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d80631b4-3fa5-491b-b330-80f733c3b0a4" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036249 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d80631b4-3fa5-491b-b330-80f733c3b0a4" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036266 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036274 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036283 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a2a67d-4e6e-4514-9304-966057dd71bf" containerName="extract-content" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036291 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a2a67d-4e6e-4514-9304-966057dd71bf" containerName="extract-content" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036303 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a2a67d-4e6e-4514-9304-966057dd71bf" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036311 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a2a67d-4e6e-4514-9304-966057dd71bf" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036321 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876ff782-f899-41ad-801d-52d31854b34c" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036333 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="876ff782-f899-41ad-801d-52d31854b34c" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036349 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a2a67d-4e6e-4514-9304-966057dd71bf" containerName="extract-utilities" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036356 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a2a67d-4e6e-4514-9304-966057dd71bf" containerName="extract-utilities" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036367 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d80631b4-3fa5-491b-b330-80f733c3b0a4" containerName="extract-utilities" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036374 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d80631b4-3fa5-491b-b330-80f733c3b0a4" containerName="extract-utilities" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036383 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerName="extract-utilities" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036390 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerName="extract-utilities" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036403 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerName="extract-content" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036410 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerName="extract-content" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036422 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876ff782-f899-41ad-801d-52d31854b34c" containerName="extract-content" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036429 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="876ff782-f899-41ad-801d-52d31854b34c" containerName="extract-content" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036440 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d80631b4-3fa5-491b-b330-80f733c3b0a4" containerName="extract-content" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036447 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d80631b4-3fa5-491b-b330-80f733c3b0a4" containerName="extract-content" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036606 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="876ff782-f899-41ad-801d-52d31854b34c" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036624 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d80631b4-3fa5-491b-b330-80f733c3b0a4" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036634 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" containerName="marketplace-operator" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036643 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a2a67d-4e6e-4514-9304-966057dd71bf" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036653 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" containerName="marketplace-operator" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036663 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ffcb90-9016-4c43-8b6c-9452e9cf6e24" containerName="registry-server" Dec 03 22:11:20 crc kubenswrapper[4830]: E1203 22:11:20.036767 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" containerName="marketplace-operator" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.036777 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4274708-7133-40ca-a10d-e3d2c5fba4cf" containerName="marketplace-operator" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.037462 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.040364 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.044405 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mc5ls"] Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.121831 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85f85a6a-507e-4744-91ee-1e9471e607c4-catalog-content\") pod \"redhat-operators-mc5ls\" (UID: \"85f85a6a-507e-4744-91ee-1e9471e607c4\") " pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.121885 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hltmg\" (UniqueName: \"kubernetes.io/projected/85f85a6a-507e-4744-91ee-1e9471e607c4-kube-api-access-hltmg\") pod \"redhat-operators-mc5ls\" (UID: \"85f85a6a-507e-4744-91ee-1e9471e607c4\") " pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.121922 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85f85a6a-507e-4744-91ee-1e9471e607c4-utilities\") pod \"redhat-operators-mc5ls\" (UID: \"85f85a6a-507e-4744-91ee-1e9471e607c4\") " pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.223590 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85f85a6a-507e-4744-91ee-1e9471e607c4-catalog-content\") pod \"redhat-operators-mc5ls\" (UID: 
\"85f85a6a-507e-4744-91ee-1e9471e607c4\") " pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.223960 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hltmg\" (UniqueName: \"kubernetes.io/projected/85f85a6a-507e-4744-91ee-1e9471e607c4-kube-api-access-hltmg\") pod \"redhat-operators-mc5ls\" (UID: \"85f85a6a-507e-4744-91ee-1e9471e607c4\") " pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.224203 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85f85a6a-507e-4744-91ee-1e9471e607c4-utilities\") pod \"redhat-operators-mc5ls\" (UID: \"85f85a6a-507e-4744-91ee-1e9471e607c4\") " pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.224199 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85f85a6a-507e-4744-91ee-1e9471e607c4-catalog-content\") pod \"redhat-operators-mc5ls\" (UID: \"85f85a6a-507e-4744-91ee-1e9471e607c4\") " pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.224454 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85f85a6a-507e-4744-91ee-1e9471e607c4-utilities\") pod \"redhat-operators-mc5ls\" (UID: \"85f85a6a-507e-4744-91ee-1e9471e607c4\") " pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.244862 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hltmg\" (UniqueName: \"kubernetes.io/projected/85f85a6a-507e-4744-91ee-1e9471e607c4-kube-api-access-hltmg\") pod \"redhat-operators-mc5ls\" (UID: \"85f85a6a-507e-4744-91ee-1e9471e607c4\") " 
pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.362789 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.782081 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mc5ls"] Dec 03 22:11:20 crc kubenswrapper[4830]: I1203 22:11:20.918724 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mc5ls" event={"ID":"85f85a6a-507e-4744-91ee-1e9471e607c4","Type":"ContainerStarted","Data":"21bd154860021025ec8af1de8746795eb4935595c6ee8d09ec4c3bcb2a41547d"} Dec 03 22:11:21 crc kubenswrapper[4830]: I1203 22:11:21.837243 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tf9xv"] Dec 03 22:11:21 crc kubenswrapper[4830]: I1203 22:11:21.838715 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:21 crc kubenswrapper[4830]: I1203 22:11:21.841108 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 22:11:21 crc kubenswrapper[4830]: I1203 22:11:21.842916 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tf9xv"] Dec 03 22:11:21 crc kubenswrapper[4830]: I1203 22:11:21.925318 4830 generic.go:334] "Generic (PLEG): container finished" podID="85f85a6a-507e-4744-91ee-1e9471e607c4" containerID="774efaca9ae68a9fac0852c061243f72384700aaa7e25568cfb7ff3bcc46de0e" exitCode=0 Dec 03 22:11:21 crc kubenswrapper[4830]: I1203 22:11:21.926516 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mc5ls" event={"ID":"85f85a6a-507e-4744-91ee-1e9471e607c4","Type":"ContainerDied","Data":"774efaca9ae68a9fac0852c061243f72384700aaa7e25568cfb7ff3bcc46de0e"} Dec 03 22:11:21 crc kubenswrapper[4830]: I1203 22:11:21.953478 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a883c6ca-b81f-4954-9606-552fb6ee7b29-catalog-content\") pod \"certified-operators-tf9xv\" (UID: \"a883c6ca-b81f-4954-9606-552fb6ee7b29\") " pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:21 crc kubenswrapper[4830]: I1203 22:11:21.953605 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwnd2\" (UniqueName: \"kubernetes.io/projected/a883c6ca-b81f-4954-9606-552fb6ee7b29-kube-api-access-pwnd2\") pod \"certified-operators-tf9xv\" (UID: \"a883c6ca-b81f-4954-9606-552fb6ee7b29\") " pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:21 crc kubenswrapper[4830]: I1203 22:11:21.953640 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a883c6ca-b81f-4954-9606-552fb6ee7b29-utilities\") pod \"certified-operators-tf9xv\" (UID: \"a883c6ca-b81f-4954-9606-552fb6ee7b29\") " pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.054468 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a883c6ca-b81f-4954-9606-552fb6ee7b29-catalog-content\") pod \"certified-operators-tf9xv\" (UID: \"a883c6ca-b81f-4954-9606-552fb6ee7b29\") " pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.054644 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwnd2\" (UniqueName: \"kubernetes.io/projected/a883c6ca-b81f-4954-9606-552fb6ee7b29-kube-api-access-pwnd2\") pod \"certified-operators-tf9xv\" (UID: \"a883c6ca-b81f-4954-9606-552fb6ee7b29\") " pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.054703 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a883c6ca-b81f-4954-9606-552fb6ee7b29-utilities\") pod \"certified-operators-tf9xv\" (UID: \"a883c6ca-b81f-4954-9606-552fb6ee7b29\") " pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.055059 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a883c6ca-b81f-4954-9606-552fb6ee7b29-catalog-content\") pod \"certified-operators-tf9xv\" (UID: \"a883c6ca-b81f-4954-9606-552fb6ee7b29\") " pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.055267 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a883c6ca-b81f-4954-9606-552fb6ee7b29-utilities\") pod \"certified-operators-tf9xv\" (UID: \"a883c6ca-b81f-4954-9606-552fb6ee7b29\") " pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.076597 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwnd2\" (UniqueName: \"kubernetes.io/projected/a883c6ca-b81f-4954-9606-552fb6ee7b29-kube-api-access-pwnd2\") pod \"certified-operators-tf9xv\" (UID: \"a883c6ca-b81f-4954-9606-552fb6ee7b29\") " pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.167288 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.434267 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tkbfv"] Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.435338 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.436857 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tkbfv"] Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.437894 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.460018 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-utilities\") pod \"community-operators-tkbfv\" (UID: \"a87df9c3-5372-4399-877f-b132ae27b408\") " pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.460077 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-catalog-content\") pod \"community-operators-tkbfv\" (UID: \"a87df9c3-5372-4399-877f-b132ae27b408\") " pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.460103 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnp5j\" (UniqueName: \"kubernetes.io/projected/a87df9c3-5372-4399-877f-b132ae27b408-kube-api-access-jnp5j\") pod \"community-operators-tkbfv\" (UID: \"a87df9c3-5372-4399-877f-b132ae27b408\") " pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.561453 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-utilities\") pod \"community-operators-tkbfv\" (UID: 
\"a87df9c3-5372-4399-877f-b132ae27b408\") " pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.561629 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-catalog-content\") pod \"community-operators-tkbfv\" (UID: \"a87df9c3-5372-4399-877f-b132ae27b408\") " pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.561667 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnp5j\" (UniqueName: \"kubernetes.io/projected/a87df9c3-5372-4399-877f-b132ae27b408-kube-api-access-jnp5j\") pod \"community-operators-tkbfv\" (UID: \"a87df9c3-5372-4399-877f-b132ae27b408\") " pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.562498 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-catalog-content\") pod \"community-operators-tkbfv\" (UID: \"a87df9c3-5372-4399-877f-b132ae27b408\") " pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.562871 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-utilities\") pod \"community-operators-tkbfv\" (UID: \"a87df9c3-5372-4399-877f-b132ae27b408\") " pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.591317 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnp5j\" (UniqueName: \"kubernetes.io/projected/a87df9c3-5372-4399-877f-b132ae27b408-kube-api-access-jnp5j\") pod \"community-operators-tkbfv\" (UID: 
\"a87df9c3-5372-4399-877f-b132ae27b408\") " pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.620325 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tf9xv"] Dec 03 22:11:22 crc kubenswrapper[4830]: W1203 22:11:22.625577 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda883c6ca_b81f_4954_9606_552fb6ee7b29.slice/crio-95fcd6223d59c4d63eaa0cea05a997b5121a474348077566c7785a7e1947fd64 WatchSource:0}: Error finding container 95fcd6223d59c4d63eaa0cea05a997b5121a474348077566c7785a7e1947fd64: Status 404 returned error can't find the container with id 95fcd6223d59c4d63eaa0cea05a997b5121a474348077566c7785a7e1947fd64 Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.757930 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.931820 4830 generic.go:334] "Generic (PLEG): container finished" podID="a883c6ca-b81f-4954-9606-552fb6ee7b29" containerID="66b90d0d67380e8ffab7da77d3842becb9f42dd8d46a6f489b487dcd181f74f1" exitCode=0 Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.931969 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf9xv" event={"ID":"a883c6ca-b81f-4954-9606-552fb6ee7b29","Type":"ContainerDied","Data":"66b90d0d67380e8ffab7da77d3842becb9f42dd8d46a6f489b487dcd181f74f1"} Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.932155 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf9xv" event={"ID":"a883c6ca-b81f-4954-9606-552fb6ee7b29","Type":"ContainerStarted","Data":"95fcd6223d59c4d63eaa0cea05a997b5121a474348077566c7785a7e1947fd64"} Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.933970 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mc5ls" event={"ID":"85f85a6a-507e-4744-91ee-1e9471e607c4","Type":"ContainerStarted","Data":"bdbd95a58d691560aeeecfae82d047817a9d1f2390291540849d4007b9cbea5a"} Dec 03 22:11:22 crc kubenswrapper[4830]: I1203 22:11:22.960072 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tkbfv"] Dec 03 22:11:22 crc kubenswrapper[4830]: W1203 22:11:22.968696 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda87df9c3_5372_4399_877f_b132ae27b408.slice/crio-38ee9bbb514fdd40955938d748ed5780d96805a14e9123877a79d3f8661f209a WatchSource:0}: Error finding container 38ee9bbb514fdd40955938d748ed5780d96805a14e9123877a79d3f8661f209a: Status 404 returned error can't find the container with id 38ee9bbb514fdd40955938d748ed5780d96805a14e9123877a79d3f8661f209a Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.940321 4830 generic.go:334] "Generic (PLEG): container finished" podID="a87df9c3-5372-4399-877f-b132ae27b408" containerID="7428cf4d1066813b915c8018f2d0f437cbd075335f2b71d63115850be3d78561" exitCode=0 Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.940813 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkbfv" event={"ID":"a87df9c3-5372-4399-877f-b132ae27b408","Type":"ContainerDied","Data":"7428cf4d1066813b915c8018f2d0f437cbd075335f2b71d63115850be3d78561"} Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.940844 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkbfv" event={"ID":"a87df9c3-5372-4399-877f-b132ae27b408","Type":"ContainerStarted","Data":"38ee9bbb514fdd40955938d748ed5780d96805a14e9123877a79d3f8661f209a"} Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.944803 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-tf9xv" event={"ID":"a883c6ca-b81f-4954-9606-552fb6ee7b29","Type":"ContainerStarted","Data":"018e669be7de33a9cdbb1b5c9db118c280358a965febf9fcfc0b714ea743457b"} Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.948143 4830 generic.go:334] "Generic (PLEG): container finished" podID="85f85a6a-507e-4744-91ee-1e9471e607c4" containerID="bdbd95a58d691560aeeecfae82d047817a9d1f2390291540849d4007b9cbea5a" exitCode=0 Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.948190 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mc5ls" event={"ID":"85f85a6a-507e-4744-91ee-1e9471e607c4","Type":"ContainerDied","Data":"bdbd95a58d691560aeeecfae82d047817a9d1f2390291540849d4007b9cbea5a"} Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.950844 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p428b"] Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.954911 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.966977 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p428b"] Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.982280 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd1cee12-f958-4240-9367-a405bd4ce9b6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.982332 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd1cee12-f958-4240-9367-a405bd4ce9b6-trusted-ca\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.982370 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd1cee12-f958-4240-9367-a405bd4ce9b6-bound-sa-token\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.982393 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p478\" (UniqueName: \"kubernetes.io/projected/fd1cee12-f958-4240-9367-a405bd4ce9b6-kube-api-access-9p478\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" 
Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.982445 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd1cee12-f958-4240-9367-a405bd4ce9b6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.982527 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.982554 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd1cee12-f958-4240-9367-a405bd4ce9b6-registry-tls\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:23 crc kubenswrapper[4830]: I1203 22:11:23.982575 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd1cee12-f958-4240-9367-a405bd4ce9b6-registry-certificates\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.060369 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.083317 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd1cee12-f958-4240-9367-a405bd4ce9b6-trusted-ca\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.083384 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd1cee12-f958-4240-9367-a405bd4ce9b6-bound-sa-token\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.083412 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p478\" (UniqueName: \"kubernetes.io/projected/fd1cee12-f958-4240-9367-a405bd4ce9b6-kube-api-access-9p478\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.083449 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd1cee12-f958-4240-9367-a405bd4ce9b6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.083496 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd1cee12-f958-4240-9367-a405bd4ce9b6-registry-tls\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.083546 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd1cee12-f958-4240-9367-a405bd4ce9b6-registry-certificates\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.083596 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd1cee12-f958-4240-9367-a405bd4ce9b6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.084593 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd1cee12-f958-4240-9367-a405bd4ce9b6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.085215 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd1cee12-f958-4240-9367-a405bd4ce9b6-trusted-ca\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 
crc kubenswrapper[4830]: I1203 22:11:24.085231 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd1cee12-f958-4240-9367-a405bd4ce9b6-registry-certificates\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.089976 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd1cee12-f958-4240-9367-a405bd4ce9b6-registry-tls\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.092072 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd1cee12-f958-4240-9367-a405bd4ce9b6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.097929 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p478\" (UniqueName: \"kubernetes.io/projected/fd1cee12-f958-4240-9367-a405bd4ce9b6-kube-api-access-9p478\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.099108 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd1cee12-f958-4240-9367-a405bd4ce9b6-bound-sa-token\") pod \"image-registry-66df7c8f76-p428b\" (UID: \"fd1cee12-f958-4240-9367-a405bd4ce9b6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.229942 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m7gkr"] Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.230857 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.235166 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.249378 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7gkr"] Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.285205 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3387e2a9-0110-4c74-adf2-87587a00adf8-catalog-content\") pod \"redhat-marketplace-m7gkr\" (UID: \"3387e2a9-0110-4c74-adf2-87587a00adf8\") " pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.285369 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3387e2a9-0110-4c74-adf2-87587a00adf8-utilities\") pod \"redhat-marketplace-m7gkr\" (UID: \"3387e2a9-0110-4c74-adf2-87587a00adf8\") " pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.285412 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dwm8\" (UniqueName: \"kubernetes.io/projected/3387e2a9-0110-4c74-adf2-87587a00adf8-kube-api-access-2dwm8\") pod \"redhat-marketplace-m7gkr\" (UID: \"3387e2a9-0110-4c74-adf2-87587a00adf8\") " 
pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.313115 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.386605 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3387e2a9-0110-4c74-adf2-87587a00adf8-utilities\") pod \"redhat-marketplace-m7gkr\" (UID: \"3387e2a9-0110-4c74-adf2-87587a00adf8\") " pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.386659 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dwm8\" (UniqueName: \"kubernetes.io/projected/3387e2a9-0110-4c74-adf2-87587a00adf8-kube-api-access-2dwm8\") pod \"redhat-marketplace-m7gkr\" (UID: \"3387e2a9-0110-4c74-adf2-87587a00adf8\") " pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.386711 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3387e2a9-0110-4c74-adf2-87587a00adf8-catalog-content\") pod \"redhat-marketplace-m7gkr\" (UID: \"3387e2a9-0110-4c74-adf2-87587a00adf8\") " pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.388044 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3387e2a9-0110-4c74-adf2-87587a00adf8-utilities\") pod \"redhat-marketplace-m7gkr\" (UID: \"3387e2a9-0110-4c74-adf2-87587a00adf8\") " pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.393069 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3387e2a9-0110-4c74-adf2-87587a00adf8-catalog-content\") pod \"redhat-marketplace-m7gkr\" (UID: \"3387e2a9-0110-4c74-adf2-87587a00adf8\") " pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.409579 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dwm8\" (UniqueName: \"kubernetes.io/projected/3387e2a9-0110-4c74-adf2-87587a00adf8-kube-api-access-2dwm8\") pod \"redhat-marketplace-m7gkr\" (UID: \"3387e2a9-0110-4c74-adf2-87587a00adf8\") " pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.553309 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.718550 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p428b"] Dec 03 22:11:24 crc kubenswrapper[4830]: W1203 22:11:24.729670 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd1cee12_f958_4240_9367_a405bd4ce9b6.slice/crio-afedd3e5032a4f1fe91057fe129ddd15c8beac8bc29680b18077a0eff653a9a6 WatchSource:0}: Error finding container afedd3e5032a4f1fe91057fe129ddd15c8beac8bc29680b18077a0eff653a9a6: Status 404 returned error can't find the container with id afedd3e5032a4f1fe91057fe129ddd15c8beac8bc29680b18077a0eff653a9a6 Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.937754 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7gkr"] Dec 03 22:11:24 crc kubenswrapper[4830]: W1203 22:11:24.945815 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3387e2a9_0110_4c74_adf2_87587a00adf8.slice/crio-ff7b6c63d9b193bae7d60be28db671278b16502b65e9bb41084cb72e7f94fac7 WatchSource:0}: Error finding container ff7b6c63d9b193bae7d60be28db671278b16502b65e9bb41084cb72e7f94fac7: Status 404 returned error can't find the container with id ff7b6c63d9b193bae7d60be28db671278b16502b65e9bb41084cb72e7f94fac7 Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.952605 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7gkr" event={"ID":"3387e2a9-0110-4c74-adf2-87587a00adf8","Type":"ContainerStarted","Data":"ff7b6c63d9b193bae7d60be28db671278b16502b65e9bb41084cb72e7f94fac7"} Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.953681 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p428b" event={"ID":"fd1cee12-f958-4240-9367-a405bd4ce9b6","Type":"ContainerStarted","Data":"afedd3e5032a4f1fe91057fe129ddd15c8beac8bc29680b18077a0eff653a9a6"} Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.955244 4830 generic.go:334] "Generic (PLEG): container finished" podID="a883c6ca-b81f-4954-9606-552fb6ee7b29" containerID="018e669be7de33a9cdbb1b5c9db118c280358a965febf9fcfc0b714ea743457b" exitCode=0 Dec 03 22:11:24 crc kubenswrapper[4830]: I1203 22:11:24.955282 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf9xv" event={"ID":"a883c6ca-b81f-4954-9606-552fb6ee7b29","Type":"ContainerDied","Data":"018e669be7de33a9cdbb1b5c9db118c280358a965febf9fcfc0b714ea743457b"} Dec 03 22:11:25 crc kubenswrapper[4830]: I1203 22:11:25.960180 4830 generic.go:334] "Generic (PLEG): container finished" podID="3387e2a9-0110-4c74-adf2-87587a00adf8" containerID="9c8deb31157731d9e2ad66b0bafea0edaafee98dd9f0c522c9b10983b8d0e76a" exitCode=0 Dec 03 22:11:25 crc kubenswrapper[4830]: I1203 22:11:25.960250 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7gkr" event={"ID":"3387e2a9-0110-4c74-adf2-87587a00adf8","Type":"ContainerDied","Data":"9c8deb31157731d9e2ad66b0bafea0edaafee98dd9f0c522c9b10983b8d0e76a"} Dec 03 22:11:25 crc kubenswrapper[4830]: I1203 22:11:25.966921 4830 generic.go:334] "Generic (PLEG): container finished" podID="a87df9c3-5372-4399-877f-b132ae27b408" containerID="121cc330d07c4a493aac005dedf718fd9573082e57195ac30fa8c0e6fd2cf359" exitCode=0 Dec 03 22:11:25 crc kubenswrapper[4830]: I1203 22:11:25.967001 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkbfv" event={"ID":"a87df9c3-5372-4399-877f-b132ae27b408","Type":"ContainerDied","Data":"121cc330d07c4a493aac005dedf718fd9573082e57195ac30fa8c0e6fd2cf359"} Dec 03 22:11:25 crc kubenswrapper[4830]: I1203 22:11:25.978314 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p428b" event={"ID":"fd1cee12-f958-4240-9367-a405bd4ce9b6","Type":"ContainerStarted","Data":"864d3c6e1488826d1d9b6965e25323bdc62cdaca4bbee7a8d61b70d679d6e85f"} Dec 03 22:11:25 crc kubenswrapper[4830]: I1203 22:11:25.978442 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:25 crc kubenswrapper[4830]: I1203 22:11:25.986899 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf9xv" event={"ID":"a883c6ca-b81f-4954-9606-552fb6ee7b29","Type":"ContainerStarted","Data":"d577811bdd077aee234b38de2cf56196b74d113de64d5e8c184f668878f9d2bf"} Dec 03 22:11:25 crc kubenswrapper[4830]: I1203 22:11:25.991209 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mc5ls" event={"ID":"85f85a6a-507e-4744-91ee-1e9471e607c4","Type":"ContainerStarted","Data":"b0ba352af7fac0ce080efc700de9a8aba635292206cb1db9663cde47861c7183"} Dec 03 22:11:26 crc 
kubenswrapper[4830]: I1203 22:11:26.008632 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p428b" podStartSLOduration=3.008614265 podStartE2EDuration="3.008614265s" podCreationTimestamp="2025-12-03 22:11:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:11:26.003999609 +0000 UTC m=+375.000460978" watchObservedRunningTime="2025-12-03 22:11:26.008614265 +0000 UTC m=+375.005075604" Dec 03 22:11:26 crc kubenswrapper[4830]: I1203 22:11:26.043767 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mc5ls" podStartSLOduration=3.169376066 podStartE2EDuration="6.043747804s" podCreationTimestamp="2025-12-03 22:11:20 +0000 UTC" firstStartedPulling="2025-12-03 22:11:21.927424089 +0000 UTC m=+370.923885448" lastFinishedPulling="2025-12-03 22:11:24.801795837 +0000 UTC m=+373.798257186" observedRunningTime="2025-12-03 22:11:26.039454887 +0000 UTC m=+375.035916256" watchObservedRunningTime="2025-12-03 22:11:26.043747804 +0000 UTC m=+375.040209153" Dec 03 22:11:26 crc kubenswrapper[4830]: I1203 22:11:26.056375 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tf9xv" podStartSLOduration=2.5008989809999997 podStartE2EDuration="5.056357389s" podCreationTimestamp="2025-12-03 22:11:21 +0000 UTC" firstStartedPulling="2025-12-03 22:11:22.941416951 +0000 UTC m=+371.937878300" lastFinishedPulling="2025-12-03 22:11:25.496875359 +0000 UTC m=+374.493336708" observedRunningTime="2025-12-03 22:11:26.055280689 +0000 UTC m=+375.051742048" watchObservedRunningTime="2025-12-03 22:11:26.056357389 +0000 UTC m=+375.052818728" Dec 03 22:11:26 crc kubenswrapper[4830]: I1203 22:11:26.681207 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:11:26 crc kubenswrapper[4830]: I1203 22:11:26.681671 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:11:28 crc kubenswrapper[4830]: I1203 22:11:28.003289 4830 generic.go:334] "Generic (PLEG): container finished" podID="3387e2a9-0110-4c74-adf2-87587a00adf8" containerID="81d34509231c74b3b0755fa671a6f94504703c45762e48c1200e4435a77e045a" exitCode=0 Dec 03 22:11:28 crc kubenswrapper[4830]: I1203 22:11:28.003567 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7gkr" event={"ID":"3387e2a9-0110-4c74-adf2-87587a00adf8","Type":"ContainerDied","Data":"81d34509231c74b3b0755fa671a6f94504703c45762e48c1200e4435a77e045a"} Dec 03 22:11:28 crc kubenswrapper[4830]: I1203 22:11:28.007140 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkbfv" event={"ID":"a87df9c3-5372-4399-877f-b132ae27b408","Type":"ContainerStarted","Data":"fe2f0c4ce7981412cab19b2c8cdc51891b19b179689c656709298438b89dcdf4"} Dec 03 22:11:28 crc kubenswrapper[4830]: I1203 22:11:28.044892 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tkbfv" podStartSLOduration=3.618733567 podStartE2EDuration="6.044875084s" podCreationTimestamp="2025-12-03 22:11:22 +0000 UTC" firstStartedPulling="2025-12-03 22:11:23.942686425 +0000 UTC m=+372.939147774" lastFinishedPulling="2025-12-03 22:11:26.368827942 +0000 UTC m=+375.365289291" observedRunningTime="2025-12-03 
22:11:28.041986325 +0000 UTC m=+377.038447674" watchObservedRunningTime="2025-12-03 22:11:28.044875084 +0000 UTC m=+377.041336423" Dec 03 22:11:29 crc kubenswrapper[4830]: I1203 22:11:29.013258 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7gkr" event={"ID":"3387e2a9-0110-4c74-adf2-87587a00adf8","Type":"ContainerStarted","Data":"3c9457e4f901d979ee67f7092438c1e9098eaa043d319450da3d109ad7c25bed"} Dec 03 22:11:29 crc kubenswrapper[4830]: I1203 22:11:29.033501 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m7gkr" podStartSLOduration=2.2043698 podStartE2EDuration="5.033485952s" podCreationTimestamp="2025-12-03 22:11:24 +0000 UTC" firstStartedPulling="2025-12-03 22:11:25.963616855 +0000 UTC m=+374.960078204" lastFinishedPulling="2025-12-03 22:11:28.792733017 +0000 UTC m=+377.789194356" observedRunningTime="2025-12-03 22:11:29.028876187 +0000 UTC m=+378.025337536" watchObservedRunningTime="2025-12-03 22:11:29.033485952 +0000 UTC m=+378.029947291" Dec 03 22:11:30 crc kubenswrapper[4830]: I1203 22:11:30.362942 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:30 crc kubenswrapper[4830]: I1203 22:11:30.363294 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:30 crc kubenswrapper[4830]: I1203 22:11:30.431432 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:31 crc kubenswrapper[4830]: I1203 22:11:31.062159 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mc5ls" Dec 03 22:11:32 crc kubenswrapper[4830]: I1203 22:11:32.168299 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:32 crc kubenswrapper[4830]: I1203 22:11:32.168865 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:32 crc kubenswrapper[4830]: I1203 22:11:32.219983 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:32 crc kubenswrapper[4830]: I1203 22:11:32.760035 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:32 crc kubenswrapper[4830]: I1203 22:11:32.760124 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:32 crc kubenswrapper[4830]: I1203 22:11:32.811422 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:33 crc kubenswrapper[4830]: I1203 22:11:33.072612 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tkbfv" Dec 03 22:11:33 crc kubenswrapper[4830]: I1203 22:11:33.101219 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tf9xv" Dec 03 22:11:34 crc kubenswrapper[4830]: I1203 22:11:34.554099 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:34 crc kubenswrapper[4830]: I1203 22:11:34.554452 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:34 crc kubenswrapper[4830]: I1203 22:11:34.601438 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:35 crc kubenswrapper[4830]: 
I1203 22:11:35.090748 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m7gkr" Dec 03 22:11:44 crc kubenswrapper[4830]: I1203 22:11:44.322680 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p428b" Dec 03 22:11:44 crc kubenswrapper[4830]: I1203 22:11:44.386006 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qf7rn"] Dec 03 22:11:56 crc kubenswrapper[4830]: I1203 22:11:56.681350 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:11:56 crc kubenswrapper[4830]: I1203 22:11:56.682066 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:12:09 crc kubenswrapper[4830]: I1203 22:12:09.434436 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" podUID="2fbc6d25-674b-4886-9c9e-40971da8de89" containerName="registry" containerID="cri-o://52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c" gracePeriod=30 Dec 03 22:12:09 crc kubenswrapper[4830]: I1203 22:12:09.994108 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.193125 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-tls\") pod \"2fbc6d25-674b-4886-9c9e-40971da8de89\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.193203 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2fbc6d25-674b-4886-9c9e-40971da8de89-installation-pull-secrets\") pod \"2fbc6d25-674b-4886-9c9e-40971da8de89\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.193242 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sg2j\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-kube-api-access-8sg2j\") pod \"2fbc6d25-674b-4886-9c9e-40971da8de89\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.193279 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2fbc6d25-674b-4886-9c9e-40971da8de89-ca-trust-extracted\") pod \"2fbc6d25-674b-4886-9c9e-40971da8de89\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.193420 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"2fbc6d25-674b-4886-9c9e-40971da8de89\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.193463 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-trusted-ca\") pod \"2fbc6d25-674b-4886-9c9e-40971da8de89\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.193487 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-bound-sa-token\") pod \"2fbc6d25-674b-4886-9c9e-40971da8de89\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.193550 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-certificates\") pod \"2fbc6d25-674b-4886-9c9e-40971da8de89\" (UID: \"2fbc6d25-674b-4886-9c9e-40971da8de89\") " Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.194258 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2fbc6d25-674b-4886-9c9e-40971da8de89" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.194373 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2fbc6d25-674b-4886-9c9e-40971da8de89" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.200119 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fbc6d25-674b-4886-9c9e-40971da8de89-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2fbc6d25-674b-4886-9c9e-40971da8de89" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.200112 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2fbc6d25-674b-4886-9c9e-40971da8de89" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.200431 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2fbc6d25-674b-4886-9c9e-40971da8de89" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.202976 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-kube-api-access-8sg2j" (OuterVolumeSpecName: "kube-api-access-8sg2j") pod "2fbc6d25-674b-4886-9c9e-40971da8de89" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89"). InnerVolumeSpecName "kube-api-access-8sg2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.205151 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "2fbc6d25-674b-4886-9c9e-40971da8de89" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.214880 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fbc6d25-674b-4886-9c9e-40971da8de89-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2fbc6d25-674b-4886-9c9e-40971da8de89" (UID: "2fbc6d25-674b-4886-9c9e-40971da8de89"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.288233 4830 generic.go:334] "Generic (PLEG): container finished" podID="2fbc6d25-674b-4886-9c9e-40971da8de89" containerID="52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c" exitCode=0 Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.288311 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.288338 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" event={"ID":"2fbc6d25-674b-4886-9c9e-40971da8de89","Type":"ContainerDied","Data":"52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c"} Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.288962 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qf7rn" event={"ID":"2fbc6d25-674b-4886-9c9e-40971da8de89","Type":"ContainerDied","Data":"65dc0ecb298639f91d4edb05b54bfcef89d0936609ad389318aeb987daa205be"} Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.289043 4830 scope.go:117] "RemoveContainer" containerID="52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.294372 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sg2j\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-kube-api-access-8sg2j\") on node \"crc\" DevicePath \"\"" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.294541 4830 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2fbc6d25-674b-4886-9c9e-40971da8de89-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.294674 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.294814 4830 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-bound-sa-token\") on node \"crc\" 
DevicePath \"\"" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.294941 4830 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.295040 4830 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2fbc6d25-674b-4886-9c9e-40971da8de89-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.295122 4830 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2fbc6d25-674b-4886-9c9e-40971da8de89-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.316805 4830 scope.go:117] "RemoveContainer" containerID="52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c" Dec 03 22:12:10 crc kubenswrapper[4830]: E1203 22:12:10.317292 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c\": container with ID starting with 52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c not found: ID does not exist" containerID="52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.317341 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c"} err="failed to get container status \"52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c\": rpc error: code = NotFound desc = could not find container \"52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c\": container with ID starting with 
52453121313bf3e5ecc36c182e652d059a32693368089bfc40049e548aa6370c not found: ID does not exist" Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.335343 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qf7rn"] Dec 03 22:12:10 crc kubenswrapper[4830]: I1203 22:12:10.339874 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qf7rn"] Dec 03 22:12:11 crc kubenswrapper[4830]: I1203 22:12:11.348396 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fbc6d25-674b-4886-9c9e-40971da8de89" path="/var/lib/kubelet/pods/2fbc6d25-674b-4886-9c9e-40971da8de89/volumes" Dec 03 22:12:26 crc kubenswrapper[4830]: I1203 22:12:26.681184 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:12:26 crc kubenswrapper[4830]: I1203 22:12:26.682186 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:12:26 crc kubenswrapper[4830]: I1203 22:12:26.682277 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:12:26 crc kubenswrapper[4830]: I1203 22:12:26.683475 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"152ffa1ead369b08b371ccd992972b35f88c027ff3c25c22eaf22bfaf6d442f6"} 
pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 22:12:26 crc kubenswrapper[4830]: I1203 22:12:26.683619 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://152ffa1ead369b08b371ccd992972b35f88c027ff3c25c22eaf22bfaf6d442f6" gracePeriod=600 Dec 03 22:12:27 crc kubenswrapper[4830]: I1203 22:12:27.397684 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="152ffa1ead369b08b371ccd992972b35f88c027ff3c25c22eaf22bfaf6d442f6" exitCode=0 Dec 03 22:12:27 crc kubenswrapper[4830]: I1203 22:12:27.397921 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"152ffa1ead369b08b371ccd992972b35f88c027ff3c25c22eaf22bfaf6d442f6"} Dec 03 22:12:27 crc kubenswrapper[4830]: I1203 22:12:27.398628 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"da3366248b70067b6bfe62a9e9986089d023f74aa981d5bd06326f97c22262eb"} Dec 03 22:12:27 crc kubenswrapper[4830]: I1203 22:12:27.398728 4830 scope.go:117] "RemoveContainer" containerID="d384acee36d352984805a1fbebe07735a2cccefaaedfc389a65a023cd6463f49" Dec 03 22:14:11 crc kubenswrapper[4830]: I1203 22:14:11.603255 4830 scope.go:117] "RemoveContainer" containerID="abec3c7aab5040ef27c81bba8b6b8cb63594e64d72337a86748fa59dc16e7c9d" Dec 03 22:14:26 crc kubenswrapper[4830]: I1203 22:14:26.681269 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:14:26 crc kubenswrapper[4830]: I1203 22:14:26.681685 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:14:56 crc kubenswrapper[4830]: I1203 22:14:56.681399 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:14:56 crc kubenswrapper[4830]: I1203 22:14:56.682237 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.199091 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl"] Dec 03 22:15:00 crc kubenswrapper[4830]: E1203 22:15:00.199839 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbc6d25-674b-4886-9c9e-40971da8de89" containerName="registry" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.199862 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbc6d25-674b-4886-9c9e-40971da8de89" containerName="registry" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.200053 
4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fbc6d25-674b-4886-9c9e-40971da8de89" containerName="registry" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.200619 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.202690 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.205984 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.212547 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl"] Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.357094 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-config-volume\") pod \"collect-profiles-29413335-59kgl\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.357177 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-secret-volume\") pod \"collect-profiles-29413335-59kgl\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.357212 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-54skm\" (UniqueName: \"kubernetes.io/projected/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-kube-api-access-54skm\") pod \"collect-profiles-29413335-59kgl\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.459722 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-config-volume\") pod \"collect-profiles-29413335-59kgl\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.459797 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-secret-volume\") pod \"collect-profiles-29413335-59kgl\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.459881 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54skm\" (UniqueName: \"kubernetes.io/projected/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-kube-api-access-54skm\") pod \"collect-profiles-29413335-59kgl\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.461244 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-config-volume\") pod \"collect-profiles-29413335-59kgl\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:00 crc 
kubenswrapper[4830]: I1203 22:15:00.470266 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-secret-volume\") pod \"collect-profiles-29413335-59kgl\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.489382 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54skm\" (UniqueName: \"kubernetes.io/projected/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-kube-api-access-54skm\") pod \"collect-profiles-29413335-59kgl\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.528538 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:00 crc kubenswrapper[4830]: I1203 22:15:00.720383 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl"] Dec 03 22:15:01 crc kubenswrapper[4830]: I1203 22:15:01.430663 4830 generic.go:334] "Generic (PLEG): container finished" podID="4dd1ff4c-8898-4d90-8f38-74e7c94f57da" containerID="f2f69b519ed5a3184685e59e69cc6b6537c08f223ce6b678946d0cbd4da13ff3" exitCode=0 Dec 03 22:15:01 crc kubenswrapper[4830]: I1203 22:15:01.430723 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" event={"ID":"4dd1ff4c-8898-4d90-8f38-74e7c94f57da","Type":"ContainerDied","Data":"f2f69b519ed5a3184685e59e69cc6b6537c08f223ce6b678946d0cbd4da13ff3"} Dec 03 22:15:01 crc kubenswrapper[4830]: I1203 22:15:01.430764 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" event={"ID":"4dd1ff4c-8898-4d90-8f38-74e7c94f57da","Type":"ContainerStarted","Data":"58e212249de40a4dc3846e3d28218c3cd6fbd5c275bb8670692a0bfda22311fc"} Dec 03 22:15:02 crc kubenswrapper[4830]: I1203 22:15:02.694419 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:02 crc kubenswrapper[4830]: I1203 22:15:02.793143 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-config-volume\") pod \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " Dec 03 22:15:02 crc kubenswrapper[4830]: I1203 22:15:02.793282 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54skm\" (UniqueName: \"kubernetes.io/projected/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-kube-api-access-54skm\") pod \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " Dec 03 22:15:02 crc kubenswrapper[4830]: I1203 22:15:02.793348 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-secret-volume\") pod \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\" (UID: \"4dd1ff4c-8898-4d90-8f38-74e7c94f57da\") " Dec 03 22:15:02 crc kubenswrapper[4830]: I1203 22:15:02.795152 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-config-volume" (OuterVolumeSpecName: "config-volume") pod "4dd1ff4c-8898-4d90-8f38-74e7c94f57da" (UID: "4dd1ff4c-8898-4d90-8f38-74e7c94f57da"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:15:02 crc kubenswrapper[4830]: I1203 22:15:02.800794 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-kube-api-access-54skm" (OuterVolumeSpecName: "kube-api-access-54skm") pod "4dd1ff4c-8898-4d90-8f38-74e7c94f57da" (UID: "4dd1ff4c-8898-4d90-8f38-74e7c94f57da"). InnerVolumeSpecName "kube-api-access-54skm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:15:02 crc kubenswrapper[4830]: I1203 22:15:02.801881 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4dd1ff4c-8898-4d90-8f38-74e7c94f57da" (UID: "4dd1ff4c-8898-4d90-8f38-74e7c94f57da"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:15:02 crc kubenswrapper[4830]: I1203 22:15:02.894971 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 22:15:02 crc kubenswrapper[4830]: I1203 22:15:02.895024 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54skm\" (UniqueName: \"kubernetes.io/projected/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-kube-api-access-54skm\") on node \"crc\" DevicePath \"\"" Dec 03 22:15:02 crc kubenswrapper[4830]: I1203 22:15:02.895045 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4dd1ff4c-8898-4d90-8f38-74e7c94f57da-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 22:15:03 crc kubenswrapper[4830]: I1203 22:15:03.442709 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" 
event={"ID":"4dd1ff4c-8898-4d90-8f38-74e7c94f57da","Type":"ContainerDied","Data":"58e212249de40a4dc3846e3d28218c3cd6fbd5c275bb8670692a0bfda22311fc"} Dec 03 22:15:03 crc kubenswrapper[4830]: I1203 22:15:03.442777 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58e212249de40a4dc3846e3d28218c3cd6fbd5c275bb8670692a0bfda22311fc" Dec 03 22:15:03 crc kubenswrapper[4830]: I1203 22:15:03.442806 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl" Dec 03 22:15:26 crc kubenswrapper[4830]: I1203 22:15:26.681605 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:15:26 crc kubenswrapper[4830]: I1203 22:15:26.682391 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:15:26 crc kubenswrapper[4830]: I1203 22:15:26.682465 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:15:26 crc kubenswrapper[4830]: I1203 22:15:26.683458 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da3366248b70067b6bfe62a9e9986089d023f74aa981d5bd06326f97c22262eb"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 22:15:26 crc 
kubenswrapper[4830]: I1203 22:15:26.683599 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://da3366248b70067b6bfe62a9e9986089d023f74aa981d5bd06326f97c22262eb" gracePeriod=600 Dec 03 22:15:27 crc kubenswrapper[4830]: I1203 22:15:27.621361 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="da3366248b70067b6bfe62a9e9986089d023f74aa981d5bd06326f97c22262eb" exitCode=0 Dec 03 22:15:27 crc kubenswrapper[4830]: I1203 22:15:27.621433 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"da3366248b70067b6bfe62a9e9986089d023f74aa981d5bd06326f97c22262eb"} Dec 03 22:15:27 crc kubenswrapper[4830]: I1203 22:15:27.621873 4830 scope.go:117] "RemoveContainer" containerID="152ffa1ead369b08b371ccd992972b35f88c027ff3c25c22eaf22bfaf6d442f6" Dec 03 22:15:27 crc kubenswrapper[4830]: I1203 22:15:27.621759 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"171c35f7222805b1cd5f3e37402f9fb3e95a2ab3a3d5d02e3ef17eeb4f38ce9b"} Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.104957 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht"] Dec 03 22:16:44 crc kubenswrapper[4830]: E1203 22:16:44.106188 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd1ff4c-8898-4d90-8f38-74e7c94f57da" containerName="collect-profiles" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.106219 4830 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4dd1ff4c-8898-4d90-8f38-74e7c94f57da" containerName="collect-profiles" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.106458 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd1ff4c-8898-4d90-8f38-74e7c94f57da" containerName="collect-profiles" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.112250 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.119274 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.120633 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht"] Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.180490 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.180562 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh9xn\" (UniqueName: \"kubernetes.io/projected/dfc36678-47ec-4ef3-bba7-a1ddec002156-kube-api-access-qh9xn\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.180700 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.282082 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.282147 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh9xn\" (UniqueName: \"kubernetes.io/projected/dfc36678-47ec-4ef3-bba7-a1ddec002156-kube-api-access-qh9xn\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.282169 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.282755 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-util\") pod 
\"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.283070 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.310896 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh9xn\" (UniqueName: \"kubernetes.io/projected/dfc36678-47ec-4ef3-bba7-a1ddec002156-kube-api-access-qh9xn\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.440151 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:44 crc kubenswrapper[4830]: I1203 22:16:44.868055 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht"] Dec 03 22:16:44 crc kubenswrapper[4830]: W1203 22:16:44.880032 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfc36678_47ec_4ef3_bba7_a1ddec002156.slice/crio-60a9a046dccec39666973a1613f1dd13f65e75518609fa282b726c3ac420ec6d WatchSource:0}: Error finding container 60a9a046dccec39666973a1613f1dd13f65e75518609fa282b726c3ac420ec6d: Status 404 returned error can't find the container with id 60a9a046dccec39666973a1613f1dd13f65e75518609fa282b726c3ac420ec6d Dec 03 22:16:45 crc kubenswrapper[4830]: I1203 22:16:45.115670 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" event={"ID":"dfc36678-47ec-4ef3-bba7-a1ddec002156","Type":"ContainerStarted","Data":"c824ec9f1eeed3ab606fd7e875dadf50578212c2e87b5ba331eee255e0a738c2"} Dec 03 22:16:45 crc kubenswrapper[4830]: I1203 22:16:45.115721 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" event={"ID":"dfc36678-47ec-4ef3-bba7-a1ddec002156","Type":"ContainerStarted","Data":"60a9a046dccec39666973a1613f1dd13f65e75518609fa282b726c3ac420ec6d"} Dec 03 22:16:46 crc kubenswrapper[4830]: I1203 22:16:46.124692 4830 generic.go:334] "Generic (PLEG): container finished" podID="dfc36678-47ec-4ef3-bba7-a1ddec002156" containerID="c824ec9f1eeed3ab606fd7e875dadf50578212c2e87b5ba331eee255e0a738c2" exitCode=0 Dec 03 22:16:46 crc kubenswrapper[4830]: I1203 22:16:46.124839 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" event={"ID":"dfc36678-47ec-4ef3-bba7-a1ddec002156","Type":"ContainerDied","Data":"c824ec9f1eeed3ab606fd7e875dadf50578212c2e87b5ba331eee255e0a738c2"} Dec 03 22:16:46 crc kubenswrapper[4830]: I1203 22:16:46.126870 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:16:47 crc kubenswrapper[4830]: I1203 22:16:47.133993 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" event={"ID":"dfc36678-47ec-4ef3-bba7-a1ddec002156","Type":"ContainerStarted","Data":"584c3489b9a933cd7f36b0bac051640ee5293da179f54c2fd5621277e057774e"} Dec 03 22:16:48 crc kubenswrapper[4830]: I1203 22:16:48.142978 4830 generic.go:334] "Generic (PLEG): container finished" podID="dfc36678-47ec-4ef3-bba7-a1ddec002156" containerID="584c3489b9a933cd7f36b0bac051640ee5293da179f54c2fd5621277e057774e" exitCode=0 Dec 03 22:16:48 crc kubenswrapper[4830]: I1203 22:16:48.143044 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" event={"ID":"dfc36678-47ec-4ef3-bba7-a1ddec002156","Type":"ContainerDied","Data":"584c3489b9a933cd7f36b0bac051640ee5293da179f54c2fd5621277e057774e"} Dec 03 22:16:49 crc kubenswrapper[4830]: I1203 22:16:49.150870 4830 generic.go:334] "Generic (PLEG): container finished" podID="dfc36678-47ec-4ef3-bba7-a1ddec002156" containerID="4319780fd91e39a7a3843a0518284eca0943257ec1247fec5c14c91e52f50232" exitCode=0 Dec 03 22:16:49 crc kubenswrapper[4830]: I1203 22:16:49.150976 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" event={"ID":"dfc36678-47ec-4ef3-bba7-a1ddec002156","Type":"ContainerDied","Data":"4319780fd91e39a7a3843a0518284eca0943257ec1247fec5c14c91e52f50232"} 
Dec 03 22:16:50 crc kubenswrapper[4830]: I1203 22:16:50.408800 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:16:50 crc kubenswrapper[4830]: I1203 22:16:50.565993 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-util\") pod \"dfc36678-47ec-4ef3-bba7-a1ddec002156\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " Dec 03 22:16:50 crc kubenswrapper[4830]: I1203 22:16:50.566233 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh9xn\" (UniqueName: \"kubernetes.io/projected/dfc36678-47ec-4ef3-bba7-a1ddec002156-kube-api-access-qh9xn\") pod \"dfc36678-47ec-4ef3-bba7-a1ddec002156\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " Dec 03 22:16:50 crc kubenswrapper[4830]: I1203 22:16:50.566277 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-bundle\") pod \"dfc36678-47ec-4ef3-bba7-a1ddec002156\" (UID: \"dfc36678-47ec-4ef3-bba7-a1ddec002156\") " Dec 03 22:16:50 crc kubenswrapper[4830]: I1203 22:16:50.569635 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-bundle" (OuterVolumeSpecName: "bundle") pod "dfc36678-47ec-4ef3-bba7-a1ddec002156" (UID: "dfc36678-47ec-4ef3-bba7-a1ddec002156"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:50 crc kubenswrapper[4830]: I1203 22:16:50.579024 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc36678-47ec-4ef3-bba7-a1ddec002156-kube-api-access-qh9xn" (OuterVolumeSpecName: "kube-api-access-qh9xn") pod "dfc36678-47ec-4ef3-bba7-a1ddec002156" (UID: "dfc36678-47ec-4ef3-bba7-a1ddec002156"). InnerVolumeSpecName "kube-api-access-qh9xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:16:50 crc kubenswrapper[4830]: I1203 22:16:50.582418 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-util" (OuterVolumeSpecName: "util") pod "dfc36678-47ec-4ef3-bba7-a1ddec002156" (UID: "dfc36678-47ec-4ef3-bba7-a1ddec002156"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:50 crc kubenswrapper[4830]: I1203 22:16:50.667645 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh9xn\" (UniqueName: \"kubernetes.io/projected/dfc36678-47ec-4ef3-bba7-a1ddec002156-kube-api-access-qh9xn\") on node \"crc\" DevicePath \"\"" Dec 03 22:16:50 crc kubenswrapper[4830]: I1203 22:16:50.667682 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:16:50 crc kubenswrapper[4830]: I1203 22:16:50.667694 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfc36678-47ec-4ef3-bba7-a1ddec002156-util\") on node \"crc\" DevicePath \"\"" Dec 03 22:16:51 crc kubenswrapper[4830]: I1203 22:16:51.175046 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" 
event={"ID":"dfc36678-47ec-4ef3-bba7-a1ddec002156","Type":"ContainerDied","Data":"60a9a046dccec39666973a1613f1dd13f65e75518609fa282b726c3ac420ec6d"} Dec 03 22:16:51 crc kubenswrapper[4830]: I1203 22:16:51.175121 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60a9a046dccec39666973a1613f1dd13f65e75518609fa282b726c3ac420ec6d" Dec 03 22:16:51 crc kubenswrapper[4830]: I1203 22:16:51.175223 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.004298 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh"] Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.005009 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc36678-47ec-4ef3-bba7-a1ddec002156" containerName="pull" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.005022 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc36678-47ec-4ef3-bba7-a1ddec002156" containerName="pull" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.005031 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc36678-47ec-4ef3-bba7-a1ddec002156" containerName="util" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.005037 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc36678-47ec-4ef3-bba7-a1ddec002156" containerName="util" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.005051 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc36678-47ec-4ef3-bba7-a1ddec002156" containerName="extract" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.005057 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc36678-47ec-4ef3-bba7-a1ddec002156" containerName="extract" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.005139 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc36678-47ec-4ef3-bba7-a1ddec002156" containerName="extract" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.005498 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.007964 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.008427 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.009569 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-n948t" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.020113 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh"] Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.098217 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnvd7\" (UniqueName: \"kubernetes.io/projected/eb61183f-1e00-4056-9cf6-d1503c208d29-kube-api-access-wnvd7\") pod \"obo-prometheus-operator-668cf9dfbb-qzckh\" (UID: \"eb61183f-1e00-4056-9cf6-d1503c208d29\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.131101 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl"] Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.131893 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.134164 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.142293 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-w8wmf" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.146723 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt"] Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.147302 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.163321 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl"] Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.168117 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt"] Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.199288 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnvd7\" (UniqueName: \"kubernetes.io/projected/eb61183f-1e00-4056-9cf6-d1503c208d29-kube-api-access-wnvd7\") pod \"obo-prometheus-operator-668cf9dfbb-qzckh\" (UID: \"eb61183f-1e00-4056-9cf6-d1503c208d29\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.218049 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnvd7\" (UniqueName: 
\"kubernetes.io/projected/eb61183f-1e00-4056-9cf6-d1503c208d29-kube-api-access-wnvd7\") pod \"obo-prometheus-operator-668cf9dfbb-qzckh\" (UID: \"eb61183f-1e00-4056-9cf6-d1503c208d29\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.300366 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35bdd835-3ab4-4828-bd70-6d3f0df5131f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt\" (UID: \"35bdd835-3ab4-4828-bd70-6d3f0df5131f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.300437 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/35bdd835-3ab4-4828-bd70-6d3f0df5131f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt\" (UID: \"35bdd835-3ab4-4828-bd70-6d3f0df5131f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.300561 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17fb9e38-36ef-4709-8d72-71e4ca6fa8ad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl\" (UID: \"17fb9e38-36ef-4709-8d72-71e4ca6fa8ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.300619 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17fb9e38-36ef-4709-8d72-71e4ca6fa8ad-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl\" (UID: \"17fb9e38-36ef-4709-8d72-71e4ca6fa8ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.321745 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.330432 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-49j7v"] Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.331190 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.333303 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.333545 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-bqzdg" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.353452 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-49j7v"] Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.401752 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35bdd835-3ab4-4828-bd70-6d3f0df5131f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt\" (UID: \"35bdd835-3ab4-4828-bd70-6d3f0df5131f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.401812 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/35bdd835-3ab4-4828-bd70-6d3f0df5131f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt\" (UID: \"35bdd835-3ab4-4828-bd70-6d3f0df5131f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.401846 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17fb9e38-36ef-4709-8d72-71e4ca6fa8ad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl\" (UID: \"17fb9e38-36ef-4709-8d72-71e4ca6fa8ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.401870 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17fb9e38-36ef-4709-8d72-71e4ca6fa8ad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl\" (UID: \"17fb9e38-36ef-4709-8d72-71e4ca6fa8ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.407634 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17fb9e38-36ef-4709-8d72-71e4ca6fa8ad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl\" (UID: \"17fb9e38-36ef-4709-8d72-71e4ca6fa8ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.410986 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17fb9e38-36ef-4709-8d72-71e4ca6fa8ad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl\" (UID: \"17fb9e38-36ef-4709-8d72-71e4ca6fa8ad\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.438955 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35bdd835-3ab4-4828-bd70-6d3f0df5131f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt\" (UID: \"35bdd835-3ab4-4828-bd70-6d3f0df5131f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.439319 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/35bdd835-3ab4-4828-bd70-6d3f0df5131f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt\" (UID: \"35bdd835-3ab4-4828-bd70-6d3f0df5131f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.446215 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.459769 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.466711 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-zjhqs"] Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.467361 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.472537 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-zjhqs"] Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.478125 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-d2qsp" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.503372 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjzd\" (UniqueName: \"kubernetes.io/projected/a167735b-f973-4627-b731-0d4ab1458916-kube-api-access-psjzd\") pod \"observability-operator-d8bb48f5d-49j7v\" (UID: \"a167735b-f973-4627-b731-0d4ab1458916\") " pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.503414 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a167735b-f973-4627-b731-0d4ab1458916-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-49j7v\" (UID: \"a167735b-f973-4627-b731-0d4ab1458916\") " pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.604173 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85-openshift-service-ca\") pod \"perses-operator-5446b9c989-zjhqs\" (UID: \"90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85\") " pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.604259 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m26mr\" (UniqueName: 
\"kubernetes.io/projected/90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85-kube-api-access-m26mr\") pod \"perses-operator-5446b9c989-zjhqs\" (UID: \"90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85\") " pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.604300 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psjzd\" (UniqueName: \"kubernetes.io/projected/a167735b-f973-4627-b731-0d4ab1458916-kube-api-access-psjzd\") pod \"observability-operator-d8bb48f5d-49j7v\" (UID: \"a167735b-f973-4627-b731-0d4ab1458916\") " pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.604320 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a167735b-f973-4627-b731-0d4ab1458916-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-49j7v\" (UID: \"a167735b-f973-4627-b731-0d4ab1458916\") " pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.608581 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a167735b-f973-4627-b731-0d4ab1458916-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-49j7v\" (UID: \"a167735b-f973-4627-b731-0d4ab1458916\") " pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.632005 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5vgkl"] Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.632353 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovn-controller" 
containerID="cri-o://12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe" gracePeriod=30 Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.632837 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="sbdb" containerID="cri-o://0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3" gracePeriod=30 Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.632888 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="nbdb" containerID="cri-o://065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d" gracePeriod=30 Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.632920 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="northd" containerID="cri-o://191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600" gracePeriod=30 Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.632951 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba" gracePeriod=30 Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.632983 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="kube-rbac-proxy-node" containerID="cri-o://544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483" gracePeriod=30 Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.633015 4830 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovn-acl-logging" containerID="cri-o://ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8" gracePeriod=30 Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.645891 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjzd\" (UniqueName: \"kubernetes.io/projected/a167735b-f973-4627-b731-0d4ab1458916-kube-api-access-psjzd\") pod \"observability-operator-d8bb48f5d-49j7v\" (UID: \"a167735b-f973-4627-b731-0d4ab1458916\") " pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.680553 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.702924 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" containerID="cri-o://237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f" gracePeriod=30 Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.706188 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m26mr\" (UniqueName: \"kubernetes.io/projected/90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85-kube-api-access-m26mr\") pod \"perses-operator-5446b9c989-zjhqs\" (UID: \"90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85\") " pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.706258 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85-openshift-service-ca\") pod 
\"perses-operator-5446b9c989-zjhqs\" (UID: \"90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85\") " pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.707066 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85-openshift-service-ca\") pod \"perses-operator-5446b9c989-zjhqs\" (UID: \"90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85\") " pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.718672 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.729825 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.736626 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m26mr\" (UniqueName: \"kubernetes.io/projected/90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85-kube-api-access-m26mr\") pod \"perses-operator-5446b9c989-zjhqs\" (UID: \"90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85\") " pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.739866 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(0853e476fe51f8b4583f99977c9fac52dc8327f7bba3fead80416991025beaa1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.739906 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(0853e476fe51f8b4583f99977c9fac52dc8327f7bba3fead80416991025beaa1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.739924 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(0853e476fe51f8b4583f99977c9fac52dc8327f7bba3fead80416991025beaa1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.739960 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-49j7v_openshift-operators(a167735b-f973-4627-b731-0d4ab1458916)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-49j7v_openshift-operators(a167735b-f973-4627-b731-0d4ab1458916)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(0853e476fe51f8b4583f99977c9fac52dc8327f7bba3fead80416991025beaa1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" podUID="a167735b-f973-4627-b731-0d4ab1458916" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.749065 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(2f7238223b0caed8fcb96d2dc6ab0dbff681e70e38f1beb274e4fbeeff35d08a): error adding pod openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: 
{\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.749108 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(2f7238223b0caed8fcb96d2dc6ab0dbff681e70e38f1beb274e4fbeeff35d08a): error adding pod openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.749143 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(2f7238223b0caed8fcb96d2dc6ab0dbff681e70e38f1beb274e4fbeeff35d08a): error adding pod 
openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.749182 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators(17fb9e38-36ef-4709-8d72-71e4ca6fa8ad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators(17fb9e38-36ef-4709-8d72-71e4ca6fa8ad)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(2f7238223b0caed8fcb96d2dc6ab0dbff681e70e38f1beb274e4fbeeff35d08a): error adding pod openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): failed to send CNI request: Post \\\"http://dummy/cni\\\": EOF: StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" podUID="17fb9e38-36ef-4709-8d72-71e4ca6fa8ad" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.759217 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(f0cad5f005a98d767aeb2073b1a711f92e76d95bd6d0ef38a37bbff3a34b640e): error adding pod openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.759318 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(f0cad5f005a98d767aeb2073b1a711f92e76d95bd6d0ef38a37bbff3a34b640e): error adding pod openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.759382 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(f0cad5f005a98d767aeb2073b1a711f92e76d95bd6d0ef38a37bbff3a34b640e): error adding pod openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: 
{\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.759480 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators(35bdd835-3ab4-4828-bd70-6d3f0df5131f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators(35bdd835-3ab4-4828-bd70-6d3f0df5131f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(f0cad5f005a98d767aeb2073b1a711f92e76d95bd6d0ef38a37bbff3a34b640e): error adding pod openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): failed to send CNI request: Post \\\"http://dummy/cni\\\": EOF: StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" podUID="35bdd835-3ab4-4828-bd70-6d3f0df5131f" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.765531 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(ea0d24235aafc7e10474b84d94b49625f23fa56d72339873ab3dbb7d78490c22): error adding pod openshift-operators_obo-prometheus-operator-668cf9dfbb-qzckh to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.765573 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(ea0d24235aafc7e10474b84d94b49625f23fa56d72339873ab3dbb7d78490c22): error adding pod openshift-operators_obo-prometheus-operator-668cf9dfbb-qzckh to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.765597 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(ea0d24235aafc7e10474b84d94b49625f23fa56d72339873ab3dbb7d78490c22): error adding pod openshift-operators_obo-prometheus-operator-668cf9dfbb-qzckh to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.765651 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators(eb61183f-1e00-4056-9cf6-d1503c208d29)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators(eb61183f-1e00-4056-9cf6-d1503c208d29)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(ea0d24235aafc7e10474b84d94b49625f23fa56d72339873ab3dbb7d78490c22): error adding pod openshift-operators_obo-prometheus-operator-668cf9dfbb-qzckh to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): failed to send CNI request: Post \\\"http://dummy/cni\\\": EOF: StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" podUID="eb61183f-1e00-4056-9cf6-d1503c208d29" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.779225 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d" cmd=["/bin/bash","-c","set -xeo 
pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.779468 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.786781 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.786852 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="nbdb" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.787305 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.787335 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="sbdb" Dec 03 22:17:01 crc kubenswrapper[4830]: I1203 22:17:01.800362 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.869854 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(0f636d69c24f867a85742a19ef16ebc35a174f88aa58fd177ee9923591efa85e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.869915 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(0f636d69c24f867a85742a19ef16ebc35a174f88aa58fd177ee9923591efa85e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.869937 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(0f636d69c24f867a85742a19ef16ebc35a174f88aa58fd177ee9923591efa85e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:01 crc kubenswrapper[4830]: E1203 22:17:01.869982 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-zjhqs_openshift-operators(90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-zjhqs_openshift-operators(90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(0f636d69c24f867a85742a19ef16ebc35a174f88aa58fd177ee9923591efa85e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" podUID="90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.097321 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/3.log" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.101763 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovn-acl-logging/0.log" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.102352 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovn-controller/0.log" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.102854 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.174486 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5w22q"] Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.174821 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.174843 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.174852 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.174862 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" 
containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.174877 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovn-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.174884 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovn-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.174894 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="nbdb" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.174901 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="nbdb" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.174912 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="sbdb" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.174918 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="sbdb" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.174925 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.174932 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.174939 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="kube-rbac-proxy-node" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.174946 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" 
containerName="kube-rbac-proxy-node" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.174955 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="kubecfg-setup" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.174961 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="kubecfg-setup" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.174970 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovn-acl-logging" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.174977 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovn-acl-logging" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.174985 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="northd" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.174992 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="northd" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.174999 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175005 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175214 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175226 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" 
containerName="sbdb" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175234 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175244 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175253 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="nbdb" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175259 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175266 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="northd" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175279 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175286 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="kube-rbac-proxy-node" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175293 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovn-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175300 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovn-acl-logging" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.175390 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175397 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.175405 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175411 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.175528 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerName="ovnkube-controller" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.177432 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.211937 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-env-overrides\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.211998 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-log-socket\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212022 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-systemd\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212040 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-netd\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212063 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-kubelet\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212112 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sktrh\" (UniqueName: \"kubernetes.io/projected/44a18320-6162-4fc5-a89c-363c4c6cd030-kube-api-access-sktrh\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212149 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44a18320-6162-4fc5-a89c-363c4c6cd030-ovn-node-metrics-cert\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212181 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-script-lib\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc 
kubenswrapper[4830]: I1203 22:17:02.212206 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-var-lib-cni-networks-ovn-kubernetes\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212194 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212230 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-var-lib-openvswitch\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212254 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-slash\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212277 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-ovn-kubernetes\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212303 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-config\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212331 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-node-log\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212325 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212354 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-openvswitch\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212388 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-systemd-units\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212412 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-netns\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212455 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-ovn\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212488 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212562 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212602 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212631 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-slash" (OuterVolumeSpecName: "host-slash") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212663 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212523 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-etc-openvswitch\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212745 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-bin\") pod \"44a18320-6162-4fc5-a89c-363c4c6cd030\" (UID: \"44a18320-6162-4fc5-a89c-363c4c6cd030\") " Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212810 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212855 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212883 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). 
InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212904 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-node-log" (OuterVolumeSpecName: "node-log") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212921 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212931 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-log-socket" (OuterVolumeSpecName: "log-socket") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212967 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.212995 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213018 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213081 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213269 4830 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-node-log\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213300 4830 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213310 4830 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213318 4830 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213326 4830 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213333 4830 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213342 4830 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213349 4830 reconciler_common.go:293] 
"Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213357 4830 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-log-socket\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213380 4830 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213389 4830 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213397 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213405 4830 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213414 4830 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213423 4830 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-slash\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213432 4830 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.213453 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44a18320-6162-4fc5-a89c-363c4c6cd030-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.231917 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.232177 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a18320-6162-4fc5-a89c-363c4c6cd030-kube-api-access-sktrh" (OuterVolumeSpecName: "kube-api-access-sktrh") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "kube-api-access-sktrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.232879 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44a18320-6162-4fc5-a89c-363c4c6cd030-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "44a18320-6162-4fc5-a89c-363c4c6cd030" (UID: "44a18320-6162-4fc5-a89c-363c4c6cd030"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.247020 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovnkube-controller/3.log" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.249374 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovn-acl-logging/0.log" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250029 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vgkl_44a18320-6162-4fc5-a89c-363c4c6cd030/ovn-controller/0.log" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250468 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f" exitCode=0 Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250501 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3" exitCode=0 Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250529 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d" exitCode=0 Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250538 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600" exitCode=0 Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250533 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" 
event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250584 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250547 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba" exitCode=0 Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250596 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250608 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250610 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483" exitCode=0 Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250619 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250625 4830 
generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8" exitCode=143 Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250630 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250635 4830 generic.go:334] "Generic (PLEG): container finished" podID="44a18320-6162-4fc5-a89c-363c4c6cd030" containerID="12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe" exitCode=143 Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250642 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250648 4830 scope.go:117] "RemoveContainer" containerID="237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250654 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250583 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250741 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250864 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250871 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250880 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250887 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250894 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250901 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250938 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" 
event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250960 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250969 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250976 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250983 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250989 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.250996 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251003 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251009 4830 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251017 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251023 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251226 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251262 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251272 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251279 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251286 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d"} Dec 03 22:17:02 crc kubenswrapper[4830]: 
I1203 22:17:02.251292 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251299 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251305 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251312 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251318 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251324 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251681 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vgkl" event={"ID":"44a18320-6162-4fc5-a89c-363c4c6cd030","Type":"ContainerDied","Data":"817569cb6e408227a7f1ee953f842f268ceb2eca9a84f6b3fa05f6e9d943f9d4"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251700 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251749 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251757 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251763 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251771 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251777 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251784 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251791 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251800 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.251807 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.252460 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sh485_bdccedf8-f580-49f0-848e-108c748d8a21/kube-multus/2.log" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.252901 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sh485_bdccedf8-f580-49f0-848e-108c748d8a21/kube-multus/1.log" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.252935 4830 generic.go:334] "Generic (PLEG): container finished" podID="bdccedf8-f580-49f0-848e-108c748d8a21" containerID="f31a89c57a557903807518ac7ebc8edd96c78ae625733ebde7f23a71a9214f2e" exitCode=2 Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.253023 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.253039 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.253014 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.253021 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sh485" event={"ID":"bdccedf8-f580-49f0-848e-108c748d8a21","Type":"ContainerDied","Data":"f31a89c57a557903807518ac7ebc8edd96c78ae625733ebde7f23a71a9214f2e"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.253082 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7"} Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.253091 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.253184 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.253326 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.253558 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.253648 4830 scope.go:117] "RemoveContainer" containerID="f31a89c57a557903807518ac7ebc8edd96c78ae625733ebde7f23a71a9214f2e" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.253866 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-sh485_openshift-multus(bdccedf8-f580-49f0-848e-108c748d8a21)\"" pod="openshift-multus/multus-sh485" podUID="bdccedf8-f580-49f0-848e-108c748d8a21" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.254055 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.254062 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.254365 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.299879 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5vgkl"] Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.304482 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5vgkl"] Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314211 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-systemd-units\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314251 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-etc-openvswitch\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314275 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-run-openvswitch\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314298 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-kubelet\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314313 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a66362d-4afc-4ad8-9282-6d5b388ba250-ovnkube-config\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314346 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztj2v\" (UniqueName: \"kubernetes.io/projected/7a66362d-4afc-4ad8-9282-6d5b388ba250-kube-api-access-ztj2v\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314367 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a66362d-4afc-4ad8-9282-6d5b388ba250-ovn-node-metrics-cert\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314390 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-run-netns\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314410 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a66362d-4afc-4ad8-9282-6d5b388ba250-env-overrides\") pod \"ovnkube-node-5w22q\" (UID: 
\"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314424 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-node-log\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314468 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a66362d-4afc-4ad8-9282-6d5b388ba250-ovnkube-script-lib\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314490 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314524 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-slash\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314555 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-run-ovn-kubernetes\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314577 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-cni-netd\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314592 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-cni-bin\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314612 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-var-lib-openvswitch\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314627 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-run-ovn\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314643 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-log-socket\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314662 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-run-systemd\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314707 4830 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44a18320-6162-4fc5-a89c-363c4c6cd030-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314719 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sktrh\" (UniqueName: \"kubernetes.io/projected/44a18320-6162-4fc5-a89c-363c4c6cd030-kube-api-access-sktrh\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.314730 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44a18320-6162-4fc5-a89c-363c4c6cd030-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.316967 4830 scope.go:117] "RemoveContainer" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.337366 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(85ae04d736563447dd70a9538285f6a1f8da4ee8141712f9c58fd0c83a332833): no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.337427 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(85ae04d736563447dd70a9538285f6a1f8da4ee8141712f9c58fd0c83a332833): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.337449 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(85ae04d736563447dd70a9538285f6a1f8da4ee8141712f9c58fd0c83a332833): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.337494 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators(eb61183f-1e00-4056-9cf6-d1503c208d29)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators(eb61183f-1e00-4056-9cf6-d1503c208d29)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(85ae04d736563447dd70a9538285f6a1f8da4ee8141712f9c58fd0c83a332833): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" podUID="eb61183f-1e00-4056-9cf6-d1503c208d29" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.348622 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(00ba6f6fba24ac9680b42c479b595f70ff04f17c7e93401ebea0414ec536a2d1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.348670 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(00ba6f6fba24ac9680b42c479b595f70ff04f17c7e93401ebea0414ec536a2d1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.348704 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(00ba6f6fba24ac9680b42c479b595f70ff04f17c7e93401ebea0414ec536a2d1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.348739 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-49j7v_openshift-operators(a167735b-f973-4627-b731-0d4ab1458916)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-49j7v_openshift-operators(a167735b-f973-4627-b731-0d4ab1458916)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(00ba6f6fba24ac9680b42c479b595f70ff04f17c7e93401ebea0414ec536a2d1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" podUID="a167735b-f973-4627-b731-0d4ab1458916" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.357182 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(0c67d74cc19d91470dc6ac43862fde85decaae867f1339f7d2e536986d836588): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.357263 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(0c67d74cc19d91470dc6ac43862fde85decaae867f1339f7d2e536986d836588): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.357293 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(0c67d74cc19d91470dc6ac43862fde85decaae867f1339f7d2e536986d836588): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.357351 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators(35bdd835-3ab4-4828-bd70-6d3f0df5131f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators(35bdd835-3ab4-4828-bd70-6d3f0df5131f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(0c67d74cc19d91470dc6ac43862fde85decaae867f1339f7d2e536986d836588): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" podUID="35bdd835-3ab4-4828-bd70-6d3f0df5131f" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.372099 4830 scope.go:117] "RemoveContainer" containerID="0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.381009 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(8cd8db2924c16078c46ea0af7f22e58feb3e08c01e4079e98baa4f1ad956c895): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.381079 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(8cd8db2924c16078c46ea0af7f22e58feb3e08c01e4079e98baa4f1ad956c895): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.381103 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(8cd8db2924c16078c46ea0af7f22e58feb3e08c01e4079e98baa4f1ad956c895): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.381152 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators(17fb9e38-36ef-4709-8d72-71e4ca6fa8ad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators(17fb9e38-36ef-4709-8d72-71e4ca6fa8ad)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(8cd8db2924c16078c46ea0af7f22e58feb3e08c01e4079e98baa4f1ad956c895): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" podUID="17fb9e38-36ef-4709-8d72-71e4ca6fa8ad" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.384403 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(42c7fbb52bd819e93fed87512c484600f0e19a80f8f463c14958d52ed3d1029f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.384457 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(42c7fbb52bd819e93fed87512c484600f0e19a80f8f463c14958d52ed3d1029f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.384475 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(42c7fbb52bd819e93fed87512c484600f0e19a80f8f463c14958d52ed3d1029f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.384530 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-zjhqs_openshift-operators(90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-zjhqs_openshift-operators(90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(42c7fbb52bd819e93fed87512c484600f0e19a80f8f463c14958d52ed3d1029f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" podUID="90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.390147 4830 scope.go:117] "RemoveContainer" containerID="065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.402158 4830 scope.go:117] "RemoveContainer" containerID="191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.414728 4830 scope.go:117] "RemoveContainer" containerID="6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415264 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-slash\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415312 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-run-ovn-kubernetes\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415338 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-cni-netd\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415362 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-cni-bin\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415388 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-run-ovn-kubernetes\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415415 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-cni-bin\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415401 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-var-lib-openvswitch\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415440 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-var-lib-openvswitch\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415451 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-cni-netd\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415387 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-slash\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415468 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-run-ovn\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415485 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-run-ovn\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415549 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-log-socket\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415599 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-run-systemd\") pod \"ovnkube-node-5w22q\" (UID: 
\"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415643 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-systemd-units\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415669 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-etc-openvswitch\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415706 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-run-openvswitch\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415720 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-kubelet\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415734 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a66362d-4afc-4ad8-9282-6d5b388ba250-ovnkube-config\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415750 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztj2v\" (UniqueName: \"kubernetes.io/projected/7a66362d-4afc-4ad8-9282-6d5b388ba250-kube-api-access-ztj2v\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415768 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a66362d-4afc-4ad8-9282-6d5b388ba250-ovn-node-metrics-cert\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415790 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-run-netns\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415818 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a66362d-4afc-4ad8-9282-6d5b388ba250-env-overrides\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415833 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-node-log\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 
crc kubenswrapper[4830]: I1203 22:17:02.415864 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-node-log\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415873 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a66362d-4afc-4ad8-9282-6d5b388ba250-ovnkube-script-lib\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415922 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-run-openvswitch\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415948 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-run-systemd\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.415959 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.416103 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-kubelet\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.416188 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-etc-openvswitch\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.416251 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-systemd-units\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.416443 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.416473 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-host-run-netns\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.416593 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a66362d-4afc-4ad8-9282-6d5b388ba250-log-socket\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.416668 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a66362d-4afc-4ad8-9282-6d5b388ba250-ovnkube-config\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.417028 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a66362d-4afc-4ad8-9282-6d5b388ba250-env-overrides\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.417752 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a66362d-4afc-4ad8-9282-6d5b388ba250-ovnkube-script-lib\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.420537 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a66362d-4afc-4ad8-9282-6d5b388ba250-ovn-node-metrics-cert\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.434064 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztj2v\" (UniqueName: 
\"kubernetes.io/projected/7a66362d-4afc-4ad8-9282-6d5b388ba250-kube-api-access-ztj2v\") pod \"ovnkube-node-5w22q\" (UID: \"7a66362d-4afc-4ad8-9282-6d5b388ba250\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.436544 4830 scope.go:117] "RemoveContainer" containerID="544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.457254 4830 scope.go:117] "RemoveContainer" containerID="ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.474913 4830 scope.go:117] "RemoveContainer" containerID="12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.491027 4830 scope.go:117] "RemoveContainer" containerID="742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.507698 4830 scope.go:117] "RemoveContainer" containerID="237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.508346 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f\": container with ID starting with 237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f not found: ID does not exist" containerID="237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.508382 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f"} err="failed to get container status \"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f\": rpc error: code = NotFound desc = could not find container 
\"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f\": container with ID starting with 237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.508403 4830 scope.go:117] "RemoveContainer" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.508906 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\": container with ID starting with 1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261 not found: ID does not exist" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.508936 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261"} err="failed to get container status \"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\": rpc error: code = NotFound desc = could not find container \"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\": container with ID starting with 1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.508954 4830 scope.go:117] "RemoveContainer" containerID="0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.509291 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\": container with ID starting with 0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3 not found: ID does not exist" 
containerID="0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.509312 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3"} err="failed to get container status \"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\": rpc error: code = NotFound desc = could not find container \"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\": container with ID starting with 0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.509325 4830 scope.go:117] "RemoveContainer" containerID="065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.509751 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\": container with ID starting with 065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d not found: ID does not exist" containerID="065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.509777 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d"} err="failed to get container status \"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\": rpc error: code = NotFound desc = could not find container \"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\": container with ID starting with 065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.509790 4830 scope.go:117] 
"RemoveContainer" containerID="191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.511019 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\": container with ID starting with 191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600 not found: ID does not exist" containerID="191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.511079 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600"} err="failed to get container status \"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\": rpc error: code = NotFound desc = could not find container \"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\": container with ID starting with 191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.511114 4830 scope.go:117] "RemoveContainer" containerID="6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.511449 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\": container with ID starting with 6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba not found: ID does not exist" containerID="6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.511472 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba"} err="failed to get container status \"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\": rpc error: code = NotFound desc = could not find container \"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\": container with ID starting with 6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.511490 4830 scope.go:117] "RemoveContainer" containerID="544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.511774 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.512229 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\": container with ID starting with 544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483 not found: ID does not exist" containerID="544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.512258 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483"} err="failed to get container status \"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\": rpc error: code = NotFound desc = could not find container \"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\": container with ID starting with 544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.512276 4830 scope.go:117] "RemoveContainer" 
containerID="ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.513300 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\": container with ID starting with ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8 not found: ID does not exist" containerID="ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.513321 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8"} err="failed to get container status \"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\": rpc error: code = NotFound desc = could not find container \"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\": container with ID starting with ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.513338 4830 scope.go:117] "RemoveContainer" containerID="12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.513668 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\": container with ID starting with 12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe not found: ID does not exist" containerID="12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.513692 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe"} err="failed to get container status \"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\": rpc error: code = NotFound desc = could not find container \"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\": container with ID starting with 12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.513709 4830 scope.go:117] "RemoveContainer" containerID="742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b" Dec 03 22:17:02 crc kubenswrapper[4830]: E1203 22:17:02.514189 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\": container with ID starting with 742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b not found: ID does not exist" containerID="742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.514214 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b"} err="failed to get container status \"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\": rpc error: code = NotFound desc = could not find container \"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\": container with ID starting with 742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.514233 4830 scope.go:117] "RemoveContainer" containerID="237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.514431 4830 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f"} err="failed to get container status \"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f\": rpc error: code = NotFound desc = could not find container \"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f\": container with ID starting with 237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.514449 4830 scope.go:117] "RemoveContainer" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.514707 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261"} err="failed to get container status \"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\": rpc error: code = NotFound desc = could not find container \"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\": container with ID starting with 1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.514734 4830 scope.go:117] "RemoveContainer" containerID="0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.515793 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3"} err="failed to get container status \"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\": rpc error: code = NotFound desc = could not find container \"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\": container with ID starting with 0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3 not 
found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.515831 4830 scope.go:117] "RemoveContainer" containerID="065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.518640 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d"} err="failed to get container status \"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\": rpc error: code = NotFound desc = could not find container \"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\": container with ID starting with 065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.518673 4830 scope.go:117] "RemoveContainer" containerID="191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.519081 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600"} err="failed to get container status \"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\": rpc error: code = NotFound desc = could not find container \"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\": container with ID starting with 191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.519105 4830 scope.go:117] "RemoveContainer" containerID="6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.519749 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba"} err="failed to get 
container status \"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\": rpc error: code = NotFound desc = could not find container \"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\": container with ID starting with 6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.519790 4830 scope.go:117] "RemoveContainer" containerID="544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.520593 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483"} err="failed to get container status \"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\": rpc error: code = NotFound desc = could not find container \"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\": container with ID starting with 544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.520625 4830 scope.go:117] "RemoveContainer" containerID="ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.521648 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8"} err="failed to get container status \"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\": rpc error: code = NotFound desc = could not find container \"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\": container with ID starting with ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.521679 4830 scope.go:117] "RemoveContainer" 
containerID="12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.523049 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe"} err="failed to get container status \"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\": rpc error: code = NotFound desc = could not find container \"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\": container with ID starting with 12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.523077 4830 scope.go:117] "RemoveContainer" containerID="742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.523343 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b"} err="failed to get container status \"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\": rpc error: code = NotFound desc = could not find container \"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\": container with ID starting with 742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.523364 4830 scope.go:117] "RemoveContainer" containerID="237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.523635 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f"} err="failed to get container status \"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f\": rpc error: code = NotFound desc = could 
not find container \"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f\": container with ID starting with 237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.523653 4830 scope.go:117] "RemoveContainer" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.523890 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261"} err="failed to get container status \"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\": rpc error: code = NotFound desc = could not find container \"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\": container with ID starting with 1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.523907 4830 scope.go:117] "RemoveContainer" containerID="0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.524162 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3"} err="failed to get container status \"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\": rpc error: code = NotFound desc = could not find container \"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\": container with ID starting with 0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.524179 4830 scope.go:117] "RemoveContainer" containerID="065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 
22:17:02.526943 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d"} err="failed to get container status \"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\": rpc error: code = NotFound desc = could not find container \"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\": container with ID starting with 065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.526981 4830 scope.go:117] "RemoveContainer" containerID="191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.527301 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600"} err="failed to get container status \"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\": rpc error: code = NotFound desc = could not find container \"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\": container with ID starting with 191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.527320 4830 scope.go:117] "RemoveContainer" containerID="6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.527864 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba"} err="failed to get container status \"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\": rpc error: code = NotFound desc = could not find container \"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\": container with ID starting with 
6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.527887 4830 scope.go:117] "RemoveContainer" containerID="544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.528363 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483"} err="failed to get container status \"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\": rpc error: code = NotFound desc = could not find container \"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\": container with ID starting with 544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.528383 4830 scope.go:117] "RemoveContainer" containerID="ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.529825 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8"} err="failed to get container status \"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\": rpc error: code = NotFound desc = could not find container \"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\": container with ID starting with ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.529845 4830 scope.go:117] "RemoveContainer" containerID="12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.532015 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe"} err="failed to get container status \"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\": rpc error: code = NotFound desc = could not find container \"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\": container with ID starting with 12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.532041 4830 scope.go:117] "RemoveContainer" containerID="742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.534755 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b"} err="failed to get container status \"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\": rpc error: code = NotFound desc = could not find container \"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\": container with ID starting with 742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.534808 4830 scope.go:117] "RemoveContainer" containerID="237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.535418 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f"} err="failed to get container status \"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f\": rpc error: code = NotFound desc = could not find container \"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f\": container with ID starting with 237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f not found: ID does not 
exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.535456 4830 scope.go:117] "RemoveContainer" containerID="1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.535923 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261"} err="failed to get container status \"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\": rpc error: code = NotFound desc = could not find container \"1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261\": container with ID starting with 1294be0920d85cd94a29e677d88517b0b64d03fb54dbcd4d082bd43a02aec261 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.535954 4830 scope.go:117] "RemoveContainer" containerID="0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.536305 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3"} err="failed to get container status \"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\": rpc error: code = NotFound desc = could not find container \"0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3\": container with ID starting with 0a0643f2f6133f976af5a5b4656cb678316975b7ff7afffb2f42890cb48299d3 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.536350 4830 scope.go:117] "RemoveContainer" containerID="065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.537297 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d"} err="failed to get container status 
\"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\": rpc error: code = NotFound desc = could not find container \"065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d\": container with ID starting with 065d6e81dc6461cf45cabef61e6abba22ac358ef4bfc954744710c3510a2c12d not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.537319 4830 scope.go:117] "RemoveContainer" containerID="191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.537686 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600"} err="failed to get container status \"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\": rpc error: code = NotFound desc = could not find container \"191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600\": container with ID starting with 191627d44988c6485ae66970092fb5fde0b6b45d3a04304900c33d8fc3107600 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.537721 4830 scope.go:117] "RemoveContainer" containerID="6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.538049 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba"} err="failed to get container status \"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\": rpc error: code = NotFound desc = could not find container \"6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba\": container with ID starting with 6b5d1b068c79b168637f562c375c84101c2af9b87b9ce02bf8f084dc159834ba not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.538081 4830 scope.go:117] "RemoveContainer" 
containerID="544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.538346 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483"} err="failed to get container status \"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\": rpc error: code = NotFound desc = could not find container \"544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483\": container with ID starting with 544d701c986ae69466c6191fa1345327429a8d728e7ae9ef6249b870cba41483 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.538376 4830 scope.go:117] "RemoveContainer" containerID="ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.538699 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8"} err="failed to get container status \"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\": rpc error: code = NotFound desc = could not find container \"ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8\": container with ID starting with ce221710f8b2a312f4afe8190812b0f65b3d00822ef53f646aae165ca565cfe8 not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.538728 4830 scope.go:117] "RemoveContainer" containerID="12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.539041 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe"} err="failed to get container status \"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\": rpc error: code = NotFound desc = could 
not find container \"12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe\": container with ID starting with 12e8d65a61dd6a8673c7ec4bb78e74a9f8c58ab9c5cbf16ee7137c7c43f576fe not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.539063 4830 scope.go:117] "RemoveContainer" containerID="742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.539417 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b"} err="failed to get container status \"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\": rpc error: code = NotFound desc = could not find container \"742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b\": container with ID starting with 742a1fe796ef0d14a1fe040ab93469b186eba86a8c54b4180be1597a2ca1d06b not found: ID does not exist" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.539443 4830 scope.go:117] "RemoveContainer" containerID="237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f" Dec 03 22:17:02 crc kubenswrapper[4830]: I1203 22:17:02.539816 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f"} err="failed to get container status \"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f\": rpc error: code = NotFound desc = could not find container \"237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f\": container with ID starting with 237c7b414e9360084f7491a6b18ac8d8585afa7078d29ce7d5ba1d3e99afff8f not found: ID does not exist" Dec 03 22:17:03 crc kubenswrapper[4830]: I1203 22:17:03.260777 4830 generic.go:334] "Generic (PLEG): container finished" podID="7a66362d-4afc-4ad8-9282-6d5b388ba250" 
containerID="0bb5f20822e5f35162999466d863bf68df9fb5d5e1f6a211468af679f9f1c025" exitCode=0 Dec 03 22:17:03 crc kubenswrapper[4830]: I1203 22:17:03.260905 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" event={"ID":"7a66362d-4afc-4ad8-9282-6d5b388ba250","Type":"ContainerDied","Data":"0bb5f20822e5f35162999466d863bf68df9fb5d5e1f6a211468af679f9f1c025"} Dec 03 22:17:03 crc kubenswrapper[4830]: I1203 22:17:03.261151 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" event={"ID":"7a66362d-4afc-4ad8-9282-6d5b388ba250","Type":"ContainerStarted","Data":"570d768a8abc309bd9c3d2c4534f09f5349195e6b4f5bfc1655b2489ff2dc557"} Dec 03 22:17:03 crc kubenswrapper[4830]: I1203 22:17:03.345925 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44a18320-6162-4fc5-a89c-363c4c6cd030" path="/var/lib/kubelet/pods/44a18320-6162-4fc5-a89c-363c4c6cd030/volumes" Dec 03 22:17:04 crc kubenswrapper[4830]: I1203 22:17:04.270228 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" event={"ID":"7a66362d-4afc-4ad8-9282-6d5b388ba250","Type":"ContainerStarted","Data":"c8a49a562299eb83f965e623039ed40934f71d32643ce04f4b8c1e592ea1ec67"} Dec 03 22:17:04 crc kubenswrapper[4830]: I1203 22:17:04.270453 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" event={"ID":"7a66362d-4afc-4ad8-9282-6d5b388ba250","Type":"ContainerStarted","Data":"70dea2f45ea7c3e198efc72f22f51187497ccfc895de23c9998414035992f20c"} Dec 03 22:17:04 crc kubenswrapper[4830]: I1203 22:17:04.270464 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" event={"ID":"7a66362d-4afc-4ad8-9282-6d5b388ba250","Type":"ContainerStarted","Data":"0f35e456806e181221154bec34b5810d134c6273bc4014be15688d91234160df"} Dec 03 22:17:04 crc kubenswrapper[4830]: I1203 22:17:04.270473 
4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" event={"ID":"7a66362d-4afc-4ad8-9282-6d5b388ba250","Type":"ContainerStarted","Data":"fb1768015565128e21e6016a625d3a3e4b7e3096506d2332240133233e3c39af"} Dec 03 22:17:04 crc kubenswrapper[4830]: I1203 22:17:04.270481 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" event={"ID":"7a66362d-4afc-4ad8-9282-6d5b388ba250","Type":"ContainerStarted","Data":"35f5aab7e246a28124d4e4fa8b56c2eed082064892cd97f2f8503a0245ffba92"} Dec 03 22:17:04 crc kubenswrapper[4830]: I1203 22:17:04.270489 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" event={"ID":"7a66362d-4afc-4ad8-9282-6d5b388ba250","Type":"ContainerStarted","Data":"d14ab2af4dad5c50754be70922f044e18648e73687a441cd5f2a103cf7eab060"} Dec 03 22:17:07 crc kubenswrapper[4830]: I1203 22:17:07.285431 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" event={"ID":"7a66362d-4afc-4ad8-9282-6d5b388ba250","Type":"ContainerStarted","Data":"b0cd24a76f6ec5da2a7150921fd09a3ab0793351fdb7797bf4ccf56b4cf78455"} Dec 03 22:17:09 crc kubenswrapper[4830]: I1203 22:17:09.298390 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" event={"ID":"7a66362d-4afc-4ad8-9282-6d5b388ba250","Type":"ContainerStarted","Data":"0cfd608107cd6f3350f8528ab255afa49ea97cdbc2109e86f7511b86dfefc162"} Dec 03 22:17:09 crc kubenswrapper[4830]: I1203 22:17:09.298713 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:09 crc kubenswrapper[4830]: I1203 22:17:09.298729 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:09 crc kubenswrapper[4830]: I1203 22:17:09.298740 4830 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:09 crc kubenswrapper[4830]: I1203 22:17:09.326083 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" podStartSLOduration=7.326065492 podStartE2EDuration="7.326065492s" podCreationTimestamp="2025-12-03 22:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:17:09.323887502 +0000 UTC m=+718.320348861" watchObservedRunningTime="2025-12-03 22:17:09.326065492 +0000 UTC m=+718.322526841" Dec 03 22:17:09 crc kubenswrapper[4830]: I1203 22:17:09.332772 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:09 crc kubenswrapper[4830]: I1203 22:17:09.332992 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:11 crc kubenswrapper[4830]: I1203 22:17:11.695857 4830 scope.go:117] "RemoveContainer" containerID="d2c7fdabbc612decc826bbfd4b9d5a54a1f2ce04bfbb8bd829868f86ac0851d7" Dec 03 22:17:12 crc kubenswrapper[4830]: I1203 22:17:12.338535 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sh485_bdccedf8-f580-49f0-848e-108c748d8a21/kube-multus/2.log" Dec 03 22:17:13 crc kubenswrapper[4830]: I1203 22:17:13.336971 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:13 crc kubenswrapper[4830]: I1203 22:17:13.337673 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:13 crc kubenswrapper[4830]: E1203 22:17:13.387648 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(b97db084727b1ebe7d358d7be39e833f24035076d26d0059eaa613cf7322dd90): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:13 crc kubenswrapper[4830]: E1203 22:17:13.387760 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(b97db084727b1ebe7d358d7be39e833f24035076d26d0059eaa613cf7322dd90): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:13 crc kubenswrapper[4830]: E1203 22:17:13.387832 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(b97db084727b1ebe7d358d7be39e833f24035076d26d0059eaa613cf7322dd90): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:13 crc kubenswrapper[4830]: E1203 22:17:13.387921 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators(17fb9e38-36ef-4709-8d72-71e4ca6fa8ad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators(17fb9e38-36ef-4709-8d72-71e4ca6fa8ad)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(b97db084727b1ebe7d358d7be39e833f24035076d26d0059eaa613cf7322dd90): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" podUID="17fb9e38-36ef-4709-8d72-71e4ca6fa8ad" Dec 03 22:17:14 crc kubenswrapper[4830]: I1203 22:17:14.336008 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:14 crc kubenswrapper[4830]: I1203 22:17:14.336718 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:14 crc kubenswrapper[4830]: I1203 22:17:14.336721 4830 scope.go:117] "RemoveContainer" containerID="f31a89c57a557903807518ac7ebc8edd96c78ae625733ebde7f23a71a9214f2e" Dec 03 22:17:14 crc kubenswrapper[4830]: E1203 22:17:14.336875 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-sh485_openshift-multus(bdccedf8-f580-49f0-848e-108c748d8a21)\"" pod="openshift-multus/multus-sh485" podUID="bdccedf8-f580-49f0-848e-108c748d8a21" Dec 03 22:17:14 crc kubenswrapper[4830]: E1203 22:17:14.368638 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(cd2693ce76756f5fd4675affbb77971eecda28f7bfcfcdf4a506f94761f6c9cf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:14 crc kubenswrapper[4830]: E1203 22:17:14.368721 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(cd2693ce76756f5fd4675affbb77971eecda28f7bfcfcdf4a506f94761f6c9cf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:14 crc kubenswrapper[4830]: E1203 22:17:14.368745 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(cd2693ce76756f5fd4675affbb77971eecda28f7bfcfcdf4a506f94761f6c9cf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:14 crc kubenswrapper[4830]: E1203 22:17:14.368795 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators(35bdd835-3ab4-4828-bd70-6d3f0df5131f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators(35bdd835-3ab4-4828-bd70-6d3f0df5131f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(cd2693ce76756f5fd4675affbb77971eecda28f7bfcfcdf4a506f94761f6c9cf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" podUID="35bdd835-3ab4-4828-bd70-6d3f0df5131f" Dec 03 22:17:15 crc kubenswrapper[4830]: I1203 22:17:15.338838 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:15 crc kubenswrapper[4830]: I1203 22:17:15.339194 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:15 crc kubenswrapper[4830]: E1203 22:17:15.371324 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(edab7a2a94ca8d82af18a2f34b203f9a2586aedf5ba369496169746b58d2b50f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:15 crc kubenswrapper[4830]: E1203 22:17:15.371394 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(edab7a2a94ca8d82af18a2f34b203f9a2586aedf5ba369496169746b58d2b50f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:15 crc kubenswrapper[4830]: E1203 22:17:15.371415 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(edab7a2a94ca8d82af18a2f34b203f9a2586aedf5ba369496169746b58d2b50f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:15 crc kubenswrapper[4830]: E1203 22:17:15.371456 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-zjhqs_openshift-operators(90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-zjhqs_openshift-operators(90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(edab7a2a94ca8d82af18a2f34b203f9a2586aedf5ba369496169746b58d2b50f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" podUID="90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85" Dec 03 22:17:17 crc kubenswrapper[4830]: I1203 22:17:17.336807 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:17 crc kubenswrapper[4830]: I1203 22:17:17.336843 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:17 crc kubenswrapper[4830]: I1203 22:17:17.337325 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:17 crc kubenswrapper[4830]: I1203 22:17:17.337491 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:17 crc kubenswrapper[4830]: E1203 22:17:17.384727 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(0a06a6a940b4eefd2b807ea06f44f60b2a74d90fa9be769d35578a05ac080126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:17 crc kubenswrapper[4830]: E1203 22:17:17.385147 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(0a06a6a940b4eefd2b807ea06f44f60b2a74d90fa9be769d35578a05ac080126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:17 crc kubenswrapper[4830]: E1203 22:17:17.385183 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(0a06a6a940b4eefd2b807ea06f44f60b2a74d90fa9be769d35578a05ac080126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:17 crc kubenswrapper[4830]: E1203 22:17:17.385259 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-49j7v_openshift-operators(a167735b-f973-4627-b731-0d4ab1458916)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-49j7v_openshift-operators(a167735b-f973-4627-b731-0d4ab1458916)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(0a06a6a940b4eefd2b807ea06f44f60b2a74d90fa9be769d35578a05ac080126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" podUID="a167735b-f973-4627-b731-0d4ab1458916" Dec 03 22:17:17 crc kubenswrapper[4830]: E1203 22:17:17.392172 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(950f008f3b7907dc78dd484bdfe185c9f4b353ef70bc4f8e56c5933964ad1de8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:17 crc kubenswrapper[4830]: E1203 22:17:17.392249 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(950f008f3b7907dc78dd484bdfe185c9f4b353ef70bc4f8e56c5933964ad1de8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:17 crc kubenswrapper[4830]: E1203 22:17:17.392276 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(950f008f3b7907dc78dd484bdfe185c9f4b353ef70bc4f8e56c5933964ad1de8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:17 crc kubenswrapper[4830]: E1203 22:17:17.392335 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators(eb61183f-1e00-4056-9cf6-d1503c208d29)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators(eb61183f-1e00-4056-9cf6-d1503c208d29)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(950f008f3b7907dc78dd484bdfe185c9f4b353ef70bc4f8e56c5933964ad1de8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" podUID="eb61183f-1e00-4056-9cf6-d1503c208d29" Dec 03 22:17:27 crc kubenswrapper[4830]: I1203 22:17:27.336552 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:27 crc kubenswrapper[4830]: I1203 22:17:27.337368 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:27 crc kubenswrapper[4830]: I1203 22:17:27.336702 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:27 crc kubenswrapper[4830]: I1203 22:17:27.338418 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:27 crc kubenswrapper[4830]: E1203 22:17:27.383251 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(ffd7e96ced2237636cc8a4548b97e57288e613a9ceb7aa21276101911243715e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:27 crc kubenswrapper[4830]: E1203 22:17:27.383332 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(ffd7e96ced2237636cc8a4548b97e57288e613a9ceb7aa21276101911243715e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:27 crc kubenswrapper[4830]: E1203 22:17:27.383370 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(ffd7e96ced2237636cc8a4548b97e57288e613a9ceb7aa21276101911243715e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:27 crc kubenswrapper[4830]: E1203 22:17:27.383453 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators(35bdd835-3ab4-4828-bd70-6d3f0df5131f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators(35bdd835-3ab4-4828-bd70-6d3f0df5131f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_openshift-operators_35bdd835-3ab4-4828-bd70-6d3f0df5131f_0(ffd7e96ced2237636cc8a4548b97e57288e613a9ceb7aa21276101911243715e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" podUID="35bdd835-3ab4-4828-bd70-6d3f0df5131f" Dec 03 22:17:27 crc kubenswrapper[4830]: E1203 22:17:27.402406 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(ba87d954c3b2f96e7b7f9083e48ee958e80e98c2b1451b47e677d507f32889b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:27 crc kubenswrapper[4830]: E1203 22:17:27.402472 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(ba87d954c3b2f96e7b7f9083e48ee958e80e98c2b1451b47e677d507f32889b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:27 crc kubenswrapper[4830]: E1203 22:17:27.402497 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(ba87d954c3b2f96e7b7f9083e48ee958e80e98c2b1451b47e677d507f32889b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:27 crc kubenswrapper[4830]: E1203 22:17:27.402566 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators(17fb9e38-36ef-4709-8d72-71e4ca6fa8ad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators(17fb9e38-36ef-4709-8d72-71e4ca6fa8ad)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_openshift-operators_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad_0(ba87d954c3b2f96e7b7f9083e48ee958e80e98c2b1451b47e677d507f32889b5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" podUID="17fb9e38-36ef-4709-8d72-71e4ca6fa8ad" Dec 03 22:17:28 crc kubenswrapper[4830]: I1203 22:17:28.336595 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:28 crc kubenswrapper[4830]: I1203 22:17:28.337294 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:28 crc kubenswrapper[4830]: E1203 22:17:28.360867 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(2e94b01eff6a0ce9201e070f9fbab706ddaa57ea0f85e9c0e046f89b584a198f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 22:17:28 crc kubenswrapper[4830]: E1203 22:17:28.360936 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(2e94b01eff6a0ce9201e070f9fbab706ddaa57ea0f85e9c0e046f89b584a198f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:28 crc kubenswrapper[4830]: E1203 22:17:28.360958 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(2e94b01eff6a0ce9201e070f9fbab706ddaa57ea0f85e9c0e046f89b584a198f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:28 crc kubenswrapper[4830]: E1203 22:17:28.361000 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-49j7v_openshift-operators(a167735b-f973-4627-b731-0d4ab1458916)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-49j7v_openshift-operators(a167735b-f973-4627-b731-0d4ab1458916)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-49j7v_openshift-operators_a167735b-f973-4627-b731-0d4ab1458916_0(2e94b01eff6a0ce9201e070f9fbab706ddaa57ea0f85e9c0e046f89b584a198f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" podUID="a167735b-f973-4627-b731-0d4ab1458916" Dec 03 22:17:29 crc kubenswrapper[4830]: I1203 22:17:29.336900 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:29 crc kubenswrapper[4830]: I1203 22:17:29.336910 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:29 crc kubenswrapper[4830]: I1203 22:17:29.337480 4830 scope.go:117] "RemoveContainer" containerID="f31a89c57a557903807518ac7ebc8edd96c78ae625733ebde7f23a71a9214f2e" Dec 03 22:17:29 crc kubenswrapper[4830]: I1203 22:17:29.337651 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:29 crc kubenswrapper[4830]: I1203 22:17:29.337661 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:29 crc kubenswrapper[4830]: E1203 22:17:29.396782 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(6d938ba28997308cb65e5a84ec3310c9219887f4845329300ed6256a770d5799): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 22:17:29 crc kubenswrapper[4830]: E1203 22:17:29.396866 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(6d938ba28997308cb65e5a84ec3310c9219887f4845329300ed6256a770d5799): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:29 crc kubenswrapper[4830]: E1203 22:17:29.396892 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(6d938ba28997308cb65e5a84ec3310c9219887f4845329300ed6256a770d5799): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:29 crc kubenswrapper[4830]: E1203 22:17:29.396947 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-zjhqs_openshift-operators(90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-zjhqs_openshift-operators(90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-zjhqs_openshift-operators_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85_0(6d938ba28997308cb65e5a84ec3310c9219887f4845329300ed6256a770d5799): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" podUID="90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85" Dec 03 22:17:29 crc kubenswrapper[4830]: E1203 22:17:29.403481 4830 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(7489949e3723bbc190c1b1f5a0dc4ca6e2fc68e4e880e2ac4475d1fe8e2eed5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 22:17:29 crc kubenswrapper[4830]: E1203 22:17:29.403578 4830 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(7489949e3723bbc190c1b1f5a0dc4ca6e2fc68e4e880e2ac4475d1fe8e2eed5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:29 crc kubenswrapper[4830]: E1203 22:17:29.403607 4830 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(7489949e3723bbc190c1b1f5a0dc4ca6e2fc68e4e880e2ac4475d1fe8e2eed5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:29 crc kubenswrapper[4830]: E1203 22:17:29.403658 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators(eb61183f-1e00-4056-9cf6-d1503c208d29)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators(eb61183f-1e00-4056-9cf6-d1503c208d29)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-qzckh_openshift-operators_eb61183f-1e00-4056-9cf6-d1503c208d29_0(7489949e3723bbc190c1b1f5a0dc4ca6e2fc68e4e880e2ac4475d1fe8e2eed5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" podUID="eb61183f-1e00-4056-9cf6-d1503c208d29" Dec 03 22:17:30 crc kubenswrapper[4830]: I1203 22:17:30.446166 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sh485_bdccedf8-f580-49f0-848e-108c748d8a21/kube-multus/2.log" Dec 03 22:17:30 crc kubenswrapper[4830]: I1203 22:17:30.446557 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sh485" event={"ID":"bdccedf8-f580-49f0-848e-108c748d8a21","Type":"ContainerStarted","Data":"0f804196ce4e224e807e7197441e0958c2845f2bb8b8571529daf7edf4e6df23"} Dec 03 22:17:32 crc kubenswrapper[4830]: I1203 22:17:32.547062 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5w22q" Dec 03 22:17:38 crc kubenswrapper[4830]: I1203 22:17:38.336850 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:38 crc kubenswrapper[4830]: I1203 22:17:38.337940 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" Dec 03 22:17:38 crc kubenswrapper[4830]: I1203 22:17:38.557122 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl"] Dec 03 22:17:39 crc kubenswrapper[4830]: I1203 22:17:39.500844 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" event={"ID":"17fb9e38-36ef-4709-8d72-71e4ca6fa8ad","Type":"ContainerStarted","Data":"e4d448f6a6607a250c71dea88a45f446ed27117dc8878bea4b5b04563d770fe4"} Dec 03 22:17:40 crc kubenswrapper[4830]: I1203 22:17:40.336739 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:40 crc kubenswrapper[4830]: I1203 22:17:40.336804 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:40 crc kubenswrapper[4830]: I1203 22:17:40.337704 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" Dec 03 22:17:40 crc kubenswrapper[4830]: I1203 22:17:40.337733 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:40 crc kubenswrapper[4830]: I1203 22:17:40.583076 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt"] Dec 03 22:17:40 crc kubenswrapper[4830]: W1203 22:17:40.587880 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35bdd835_3ab4_4828_bd70_6d3f0df5131f.slice/crio-a55e6e0166e9b964c34645c6fadb94f9aab8a2fc485f39907b09767644d3952c WatchSource:0}: Error finding container a55e6e0166e9b964c34645c6fadb94f9aab8a2fc485f39907b09767644d3952c: Status 404 returned error can't find the container with id a55e6e0166e9b964c34645c6fadb94f9aab8a2fc485f39907b09767644d3952c Dec 03 22:17:40 crc kubenswrapper[4830]: I1203 22:17:40.631117 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-zjhqs"] Dec 03 22:17:41 crc kubenswrapper[4830]: I1203 22:17:41.513559 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" event={"ID":"90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85","Type":"ContainerStarted","Data":"3c8b2557319d3a0ba51d29bc7ce8f873d3881ae036c72f48bf763a6787385cdd"} Dec 03 22:17:41 crc kubenswrapper[4830]: I1203 22:17:41.514840 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" event={"ID":"35bdd835-3ab4-4828-bd70-6d3f0df5131f","Type":"ContainerStarted","Data":"a55e6e0166e9b964c34645c6fadb94f9aab8a2fc485f39907b09767644d3952c"} Dec 03 22:17:42 crc kubenswrapper[4830]: I1203 22:17:42.336680 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:42 crc kubenswrapper[4830]: I1203 22:17:42.337451 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:42 crc kubenswrapper[4830]: I1203 22:17:42.798862 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-49j7v"] Dec 03 22:17:42 crc kubenswrapper[4830]: W1203 22:17:42.816075 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda167735b_f973_4627_b731_0d4ab1458916.slice/crio-4855f220476855e9f6986d1385807d25d12ae6c57d02afbb2f89b28f29cff3fd WatchSource:0}: Error finding container 4855f220476855e9f6986d1385807d25d12ae6c57d02afbb2f89b28f29cff3fd: Status 404 returned error can't find the container with id 4855f220476855e9f6986d1385807d25d12ae6c57d02afbb2f89b28f29cff3fd Dec 03 22:17:43 crc kubenswrapper[4830]: I1203 22:17:43.529050 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" event={"ID":"a167735b-f973-4627-b731-0d4ab1458916","Type":"ContainerStarted","Data":"4855f220476855e9f6986d1385807d25d12ae6c57d02afbb2f89b28f29cff3fd"} Dec 03 22:17:44 crc kubenswrapper[4830]: I1203 22:17:44.335974 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:44 crc kubenswrapper[4830]: I1203 22:17:44.336881 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" Dec 03 22:17:45 crc kubenswrapper[4830]: I1203 22:17:45.625383 4830 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 22:17:46 crc kubenswrapper[4830]: I1203 22:17:46.722098 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh"] Dec 03 22:17:47 crc kubenswrapper[4830]: I1203 22:17:47.552586 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" event={"ID":"17fb9e38-36ef-4709-8d72-71e4ca6fa8ad","Type":"ContainerStarted","Data":"7c0e4d272bdb37d707b0f7788acbb5db3c84ea4a3180da87f89ee6e6fe7ada22"} Dec 03 22:17:47 crc kubenswrapper[4830]: I1203 22:17:47.556113 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" event={"ID":"35bdd835-3ab4-4828-bd70-6d3f0df5131f","Type":"ContainerStarted","Data":"990a8be0bc265b7f46e5ed69215f59b4c3ce2c8cf951b3a8aa4d5e1a3fc6b2b4"} Dec 03 22:17:47 crc kubenswrapper[4830]: I1203 22:17:47.570906 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl" podStartSLOduration=38.704830186 podStartE2EDuration="46.57088574s" podCreationTimestamp="2025-12-03 22:17:01 +0000 UTC" firstStartedPulling="2025-12-03 22:17:38.575322304 +0000 UTC m=+747.571783653" lastFinishedPulling="2025-12-03 22:17:46.441377858 +0000 UTC m=+755.437839207" observedRunningTime="2025-12-03 22:17:47.566176913 +0000 UTC m=+756.562638282" watchObservedRunningTime="2025-12-03 22:17:47.57088574 +0000 UTC m=+756.567347089" Dec 03 22:17:47 crc kubenswrapper[4830]: I1203 22:17:47.596815 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt" podStartSLOduration=40.717176301 podStartE2EDuration="46.596799822s" podCreationTimestamp="2025-12-03 22:17:01 +0000 UTC" firstStartedPulling="2025-12-03 22:17:40.590526365 +0000 UTC m=+749.586987714" lastFinishedPulling="2025-12-03 22:17:46.470149886 +0000 UTC m=+755.466611235" observedRunningTime="2025-12-03 22:17:47.595666891 +0000 UTC m=+756.592128240" watchObservedRunningTime="2025-12-03 22:17:47.596799822 +0000 UTC m=+756.593261171" Dec 03 22:17:47 crc kubenswrapper[4830]: W1203 22:17:47.929354 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb61183f_1e00_4056_9cf6_d1503c208d29.slice/crio-7c6bbf608e29bd1d759dfa74b28b350860a570a44302f73030de1b33d1c0d3ad WatchSource:0}: Error finding container 7c6bbf608e29bd1d759dfa74b28b350860a570a44302f73030de1b33d1c0d3ad: Status 404 returned error can't find the container with id 7c6bbf608e29bd1d759dfa74b28b350860a570a44302f73030de1b33d1c0d3ad Dec 03 22:17:48 crc kubenswrapper[4830]: I1203 22:17:48.562833 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" event={"ID":"eb61183f-1e00-4056-9cf6-d1503c208d29","Type":"ContainerStarted","Data":"7c6bbf608e29bd1d759dfa74b28b350860a570a44302f73030de1b33d1c0d3ad"} Dec 03 22:17:51 crc kubenswrapper[4830]: I1203 22:17:51.578952 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" event={"ID":"a167735b-f973-4627-b731-0d4ab1458916","Type":"ContainerStarted","Data":"76ae963b745df1374e7ddc958452eb2104598622e9b86315afa7833c0a3df579"} Dec 03 22:17:51 crc kubenswrapper[4830]: I1203 22:17:51.579278 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:51 crc kubenswrapper[4830]: I1203 22:17:51.580911 
4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" event={"ID":"90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85","Type":"ContainerStarted","Data":"d7739c7272b0bf981339e7f9669cd134798fa1979d48a514f6121de62ca2026a"} Dec 03 22:17:51 crc kubenswrapper[4830]: I1203 22:17:51.581081 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:17:51 crc kubenswrapper[4830]: I1203 22:17:51.585865 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" Dec 03 22:17:51 crc kubenswrapper[4830]: I1203 22:17:51.602814 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-49j7v" podStartSLOduration=42.66433931 podStartE2EDuration="50.602797093s" podCreationTimestamp="2025-12-03 22:17:01 +0000 UTC" firstStartedPulling="2025-12-03 22:17:42.82144825 +0000 UTC m=+751.817909629" lastFinishedPulling="2025-12-03 22:17:50.759906053 +0000 UTC m=+759.756367412" observedRunningTime="2025-12-03 22:17:51.595946157 +0000 UTC m=+760.592407526" watchObservedRunningTime="2025-12-03 22:17:51.602797093 +0000 UTC m=+760.599258442" Dec 03 22:17:51 crc kubenswrapper[4830]: I1203 22:17:51.618591 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" podStartSLOduration=42.237089144 podStartE2EDuration="50.618574489s" podCreationTimestamp="2025-12-03 22:17:01 +0000 UTC" firstStartedPulling="2025-12-03 22:17:40.635159972 +0000 UTC m=+749.631621321" lastFinishedPulling="2025-12-03 22:17:49.016645297 +0000 UTC m=+758.013106666" observedRunningTime="2025-12-03 22:17:51.61567789 +0000 UTC m=+760.612139279" watchObservedRunningTime="2025-12-03 22:17:51.618574489 +0000 UTC m=+760.615035838" Dec 03 22:17:53 crc kubenswrapper[4830]: I1203 
22:17:53.599186 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" event={"ID":"eb61183f-1e00-4056-9cf6-d1503c208d29","Type":"ContainerStarted","Data":"5d51930e64f2652c93e2a251f06dc6f8c222cdbc9e7cdb1d5c6852ea2f6fd032"} Dec 03 22:17:53 crc kubenswrapper[4830]: I1203 22:17:53.634851 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-qzckh" podStartSLOduration=48.925444292 podStartE2EDuration="53.634816108s" podCreationTimestamp="2025-12-03 22:17:00 +0000 UTC" firstStartedPulling="2025-12-03 22:17:47.931571607 +0000 UTC m=+756.928032976" lastFinishedPulling="2025-12-03 22:17:52.640943443 +0000 UTC m=+761.637404792" observedRunningTime="2025-12-03 22:17:53.628477277 +0000 UTC m=+762.624938646" watchObservedRunningTime="2025-12-03 22:17:53.634816108 +0000 UTC m=+762.631277517" Dec 03 22:17:56 crc kubenswrapper[4830]: I1203 22:17:56.681324 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:17:56 crc kubenswrapper[4830]: I1203 22:17:56.681454 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.516952 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-5c749"] Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.517873 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-5c749" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.520254 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.520350 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.521834 4830 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2rtcq" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.534217 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-5c749"] Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.541878 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-zbjs6"] Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.543082 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-zbjs6" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.543076 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf4wk\" (UniqueName: \"kubernetes.io/projected/d84ca934-7bba-4889-b4a6-feec21575832-kube-api-access-jf4wk\") pod \"cert-manager-cainjector-7f985d654d-5c749\" (UID: \"d84ca934-7bba-4889-b4a6-feec21575832\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-5c749" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.548612 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sr4sh"] Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.549459 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-sr4sh" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.551012 4830 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-dlc2h" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.551982 4830 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-g86kd" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.557210 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-zbjs6"] Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.580636 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sr4sh"] Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.644125 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-898mq\" (UniqueName: \"kubernetes.io/projected/812edfa4-0a35-4dce-b14c-3addb5812eb7-kube-api-access-898mq\") pod \"cert-manager-webhook-5655c58dd6-sr4sh\" (UID: \"812edfa4-0a35-4dce-b14c-3addb5812eb7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sr4sh" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.644187 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt2js\" (UniqueName: \"kubernetes.io/projected/361b0a92-2587-4f96-a941-f9c50fd46e10-kube-api-access-bt2js\") pod \"cert-manager-5b446d88c5-zbjs6\" (UID: \"361b0a92-2587-4f96-a941-f9c50fd46e10\") " pod="cert-manager/cert-manager-5b446d88c5-zbjs6" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.644254 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf4wk\" (UniqueName: \"kubernetes.io/projected/d84ca934-7bba-4889-b4a6-feec21575832-kube-api-access-jf4wk\") pod \"cert-manager-cainjector-7f985d654d-5c749\" (UID: 
\"d84ca934-7bba-4889-b4a6-feec21575832\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-5c749" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.672287 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf4wk\" (UniqueName: \"kubernetes.io/projected/d84ca934-7bba-4889-b4a6-feec21575832-kube-api-access-jf4wk\") pod \"cert-manager-cainjector-7f985d654d-5c749\" (UID: \"d84ca934-7bba-4889-b4a6-feec21575832\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-5c749" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.745106 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-898mq\" (UniqueName: \"kubernetes.io/projected/812edfa4-0a35-4dce-b14c-3addb5812eb7-kube-api-access-898mq\") pod \"cert-manager-webhook-5655c58dd6-sr4sh\" (UID: \"812edfa4-0a35-4dce-b14c-3addb5812eb7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sr4sh" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.745196 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt2js\" (UniqueName: \"kubernetes.io/projected/361b0a92-2587-4f96-a941-f9c50fd46e10-kube-api-access-bt2js\") pod \"cert-manager-5b446d88c5-zbjs6\" (UID: \"361b0a92-2587-4f96-a941-f9c50fd46e10\") " pod="cert-manager/cert-manager-5b446d88c5-zbjs6" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.767430 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-898mq\" (UniqueName: \"kubernetes.io/projected/812edfa4-0a35-4dce-b14c-3addb5812eb7-kube-api-access-898mq\") pod \"cert-manager-webhook-5655c58dd6-sr4sh\" (UID: \"812edfa4-0a35-4dce-b14c-3addb5812eb7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sr4sh" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.774460 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt2js\" (UniqueName: 
\"kubernetes.io/projected/361b0a92-2587-4f96-a941-f9c50fd46e10-kube-api-access-bt2js\") pod \"cert-manager-5b446d88c5-zbjs6\" (UID: \"361b0a92-2587-4f96-a941-f9c50fd46e10\") " pod="cert-manager/cert-manager-5b446d88c5-zbjs6" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.841199 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-5c749" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.857773 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-zbjs6" Dec 03 22:17:57 crc kubenswrapper[4830]: I1203 22:17:57.870603 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-sr4sh" Dec 03 22:17:58 crc kubenswrapper[4830]: I1203 22:17:58.288614 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-zbjs6"] Dec 03 22:17:58 crc kubenswrapper[4830]: I1203 22:17:58.295650 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sr4sh"] Dec 03 22:17:58 crc kubenswrapper[4830]: I1203 22:17:58.299136 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-5c749"] Dec 03 22:17:58 crc kubenswrapper[4830]: W1203 22:17:58.304083 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd84ca934_7bba_4889_b4a6_feec21575832.slice/crio-94cf846c8f1d4d0ff362ffef8aa24383bd8159e9c7a1eca9292e6ec333ae25e3 WatchSource:0}: Error finding container 94cf846c8f1d4d0ff362ffef8aa24383bd8159e9c7a1eca9292e6ec333ae25e3: Status 404 returned error can't find the container with id 94cf846c8f1d4d0ff362ffef8aa24383bd8159e9c7a1eca9292e6ec333ae25e3 Dec 03 22:17:58 crc kubenswrapper[4830]: W1203 22:17:58.305985 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod812edfa4_0a35_4dce_b14c_3addb5812eb7.slice/crio-9f03accace7c03b90b1174a49259e49f132deb17d8ee7f99dfe68208a9d064d4 WatchSource:0}: Error finding container 9f03accace7c03b90b1174a49259e49f132deb17d8ee7f99dfe68208a9d064d4: Status 404 returned error can't find the container with id 9f03accace7c03b90b1174a49259e49f132deb17d8ee7f99dfe68208a9d064d4 Dec 03 22:17:58 crc kubenswrapper[4830]: I1203 22:17:58.633303 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-5c749" event={"ID":"d84ca934-7bba-4889-b4a6-feec21575832","Type":"ContainerStarted","Data":"94cf846c8f1d4d0ff362ffef8aa24383bd8159e9c7a1eca9292e6ec333ae25e3"} Dec 03 22:17:58 crc kubenswrapper[4830]: I1203 22:17:58.634577 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-sr4sh" event={"ID":"812edfa4-0a35-4dce-b14c-3addb5812eb7","Type":"ContainerStarted","Data":"9f03accace7c03b90b1174a49259e49f132deb17d8ee7f99dfe68208a9d064d4"} Dec 03 22:17:58 crc kubenswrapper[4830]: I1203 22:17:58.635442 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-zbjs6" event={"ID":"361b0a92-2587-4f96-a941-f9c50fd46e10","Type":"ContainerStarted","Data":"61eb48efdc00e7c09610fc5237a119ccd45a6b822fdd344afe6daa17490ea945"} Dec 03 22:18:01 crc kubenswrapper[4830]: I1203 22:18:01.804691 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-zjhqs" Dec 03 22:18:02 crc kubenswrapper[4830]: I1203 22:18:02.659822 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-zbjs6" event={"ID":"361b0a92-2587-4f96-a941-f9c50fd46e10","Type":"ContainerStarted","Data":"6e2b795033a3847a1f9f475adb802830afc1633fccf497726e443a7a2809886e"} Dec 03 22:18:02 crc kubenswrapper[4830]: I1203 22:18:02.662001 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-5c749" event={"ID":"d84ca934-7bba-4889-b4a6-feec21575832","Type":"ContainerStarted","Data":"9dca10de15308b398792620a4f0180b9ca1d4345243f08a292e4ebd3ec58276b"} Dec 03 22:18:02 crc kubenswrapper[4830]: I1203 22:18:02.663915 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-sr4sh" event={"ID":"812edfa4-0a35-4dce-b14c-3addb5812eb7","Type":"ContainerStarted","Data":"d0204f0893b9f4db5cacd2ca7b3bd262dfc33ee5698fee35099d632a48c7d96c"} Dec 03 22:18:02 crc kubenswrapper[4830]: I1203 22:18:02.664093 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-sr4sh" Dec 03 22:18:02 crc kubenswrapper[4830]: I1203 22:18:02.708747 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-zbjs6" podStartSLOduration=1.7822562629999998 podStartE2EDuration="5.708732024s" podCreationTimestamp="2025-12-03 22:17:57 +0000 UTC" firstStartedPulling="2025-12-03 22:17:58.300546893 +0000 UTC m=+767.297008242" lastFinishedPulling="2025-12-03 22:18:02.227022634 +0000 UTC m=+771.223484003" observedRunningTime="2025-12-03 22:18:02.68269731 +0000 UTC m=+771.679158659" watchObservedRunningTime="2025-12-03 22:18:02.708732024 +0000 UTC m=+771.705193373" Dec 03 22:18:02 crc kubenswrapper[4830]: I1203 22:18:02.709387 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-5c749" podStartSLOduration=1.713994566 podStartE2EDuration="5.709383631s" podCreationTimestamp="2025-12-03 22:17:57 +0000 UTC" firstStartedPulling="2025-12-03 22:17:58.306164775 +0000 UTC m=+767.302626124" lastFinishedPulling="2025-12-03 22:18:02.30155384 +0000 UTC m=+771.298015189" observedRunningTime="2025-12-03 22:18:02.708009984 +0000 UTC m=+771.704471333" watchObservedRunningTime="2025-12-03 22:18:02.709383631 +0000 
UTC m=+771.705844980" Dec 03 22:18:07 crc kubenswrapper[4830]: I1203 22:18:07.877083 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-sr4sh" Dec 03 22:18:07 crc kubenswrapper[4830]: I1203 22:18:07.897941 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-sr4sh" podStartSLOduration=6.979481947 podStartE2EDuration="10.897884758s" podCreationTimestamp="2025-12-03 22:17:57 +0000 UTC" firstStartedPulling="2025-12-03 22:17:58.308805057 +0000 UTC m=+767.305266406" lastFinishedPulling="2025-12-03 22:18:02.227207858 +0000 UTC m=+771.223669217" observedRunningTime="2025-12-03 22:18:02.724918742 +0000 UTC m=+771.721380091" watchObservedRunningTime="2025-12-03 22:18:07.897884758 +0000 UTC m=+776.894346137" Dec 03 22:18:26 crc kubenswrapper[4830]: I1203 22:18:26.681164 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:18:26 crc kubenswrapper[4830]: I1203 22:18:26.681847 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:18:35 crc kubenswrapper[4830]: I1203 22:18:35.794729 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5"] Dec 03 22:18:35 crc kubenswrapper[4830]: I1203 22:18:35.796596 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:35 crc kubenswrapper[4830]: I1203 22:18:35.798936 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 22:18:35 crc kubenswrapper[4830]: I1203 22:18:35.813899 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5"] Dec 03 22:18:35 crc kubenswrapper[4830]: I1203 22:18:35.939157 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-bundle\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:35 crc kubenswrapper[4830]: I1203 22:18:35.939227 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:35 crc kubenswrapper[4830]: I1203 22:18:35.939254 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x9rg\" (UniqueName: \"kubernetes.io/projected/b3a9c045-3aac-47e1-ae39-645d33985f37-kube-api-access-5x9rg\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:36 crc kubenswrapper[4830]: 
I1203 22:18:36.040546 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x9rg\" (UniqueName: \"kubernetes.io/projected/b3a9c045-3aac-47e1-ae39-645d33985f37-kube-api-access-5x9rg\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:36 crc kubenswrapper[4830]: I1203 22:18:36.040647 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-bundle\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:36 crc kubenswrapper[4830]: I1203 22:18:36.040706 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:36 crc kubenswrapper[4830]: I1203 22:18:36.041224 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:36 crc kubenswrapper[4830]: I1203 22:18:36.041618 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-bundle\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:36 crc kubenswrapper[4830]: I1203 22:18:36.072640 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x9rg\" (UniqueName: \"kubernetes.io/projected/b3a9c045-3aac-47e1-ae39-645d33985f37-kube-api-access-5x9rg\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:36 crc kubenswrapper[4830]: I1203 22:18:36.118315 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:36 crc kubenswrapper[4830]: I1203 22:18:36.382876 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5"] Dec 03 22:18:36 crc kubenswrapper[4830]: I1203 22:18:36.867257 4830 generic.go:334] "Generic (PLEG): container finished" podID="b3a9c045-3aac-47e1-ae39-645d33985f37" containerID="1702a5d5cb0f853d6a49a1b9cafabe89ef46c7df83db50d250d491b581a581c0" exitCode=0 Dec 03 22:18:36 crc kubenswrapper[4830]: I1203 22:18:36.867353 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" event={"ID":"b3a9c045-3aac-47e1-ae39-645d33985f37","Type":"ContainerDied","Data":"1702a5d5cb0f853d6a49a1b9cafabe89ef46c7df83db50d250d491b581a581c0"} Dec 03 22:18:36 crc kubenswrapper[4830]: I1203 22:18:36.867602 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" event={"ID":"b3a9c045-3aac-47e1-ae39-645d33985f37","Type":"ContainerStarted","Data":"22ff2e86cbe8c371c5778828c14daca00332d81669dad3c1b5dccbc81ced7879"} Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.148707 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9hgbt"] Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.149951 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.177332 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9hgbt"] Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.271062 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-catalog-content\") pod \"redhat-operators-9hgbt\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.271139 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-utilities\") pod \"redhat-operators-9hgbt\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.271330 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzh2d\" (UniqueName: \"kubernetes.io/projected/22651d28-167e-40b7-91d2-d8b9e5426ce5-kube-api-access-tzh2d\") pod \"redhat-operators-9hgbt\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " 
pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.372261 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-catalog-content\") pod \"redhat-operators-9hgbt\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.372347 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-utilities\") pod \"redhat-operators-9hgbt\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.372414 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzh2d\" (UniqueName: \"kubernetes.io/projected/22651d28-167e-40b7-91d2-d8b9e5426ce5-kube-api-access-tzh2d\") pod \"redhat-operators-9hgbt\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.372871 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-catalog-content\") pod \"redhat-operators-9hgbt\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.372880 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-utilities\") pod \"redhat-operators-9hgbt\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:38 crc 
kubenswrapper[4830]: I1203 22:18:38.392445 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzh2d\" (UniqueName: \"kubernetes.io/projected/22651d28-167e-40b7-91d2-d8b9e5426ce5-kube-api-access-tzh2d\") pod \"redhat-operators-9hgbt\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.477339 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.683869 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9hgbt"] Dec 03 22:18:38 crc kubenswrapper[4830]: W1203 22:18:38.689686 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22651d28_167e_40b7_91d2_d8b9e5426ce5.slice/crio-7ff0ecdb953844b7744692dc6478e41e96f9825ae335e8072a2bce2bb8f21362 WatchSource:0}: Error finding container 7ff0ecdb953844b7744692dc6478e41e96f9825ae335e8072a2bce2bb8f21362: Status 404 returned error can't find the container with id 7ff0ecdb953844b7744692dc6478e41e96f9825ae335e8072a2bce2bb8f21362 Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.881333 4830 generic.go:334] "Generic (PLEG): container finished" podID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerID="7cf765524eeac6b89d35a30da28b51d77bc5c2ccd2a9659b37515972d43cd590" exitCode=0 Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.881397 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hgbt" event={"ID":"22651d28-167e-40b7-91d2-d8b9e5426ce5","Type":"ContainerDied","Data":"7cf765524eeac6b89d35a30da28b51d77bc5c2ccd2a9659b37515972d43cd590"} Dec 03 22:18:38 crc kubenswrapper[4830]: I1203 22:18:38.881434 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9hgbt" event={"ID":"22651d28-167e-40b7-91d2-d8b9e5426ce5","Type":"ContainerStarted","Data":"7ff0ecdb953844b7744692dc6478e41e96f9825ae335e8072a2bce2bb8f21362"} Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.174475 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.175178 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.177819 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.180298 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.183119 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.284697 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvgzw\" (UniqueName: \"kubernetes.io/projected/10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5-kube-api-access-hvgzw\") pod \"minio\" (UID: \"10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5\") " pod="minio-dev/minio" Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.284758 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6ee7c1d2-66d9-4e00-89f0-cc44e6ac0f0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ee7c1d2-66d9-4e00-89f0-cc44e6ac0f0e\") pod \"minio\" (UID: \"10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5\") " pod="minio-dev/minio" Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.385665 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvgzw\" (UniqueName: 
\"kubernetes.io/projected/10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5-kube-api-access-hvgzw\") pod \"minio\" (UID: \"10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5\") " pod="minio-dev/minio" Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.385738 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6ee7c1d2-66d9-4e00-89f0-cc44e6ac0f0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ee7c1d2-66d9-4e00-89f0-cc44e6ac0f0e\") pod \"minio\" (UID: \"10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5\") " pod="minio-dev/minio" Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.394531 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.394571 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6ee7c1d2-66d9-4e00-89f0-cc44e6ac0f0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ee7c1d2-66d9-4e00-89f0-cc44e6ac0f0e\") pod \"minio\" (UID: \"10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d0644c6d62af1b4ce8782e533d792a204e9d707d0d349a0acb831f9c897d9d8e/globalmount\"" pod="minio-dev/minio" Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.415999 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6ee7c1d2-66d9-4e00-89f0-cc44e6ac0f0e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6ee7c1d2-66d9-4e00-89f0-cc44e6ac0f0e\") pod \"minio\" (UID: \"10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5\") " pod="minio-dev/minio" Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.421940 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvgzw\" (UniqueName: \"kubernetes.io/projected/10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5-kube-api-access-hvgzw\") pod \"minio\" (UID: 
\"10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5\") " pod="minio-dev/minio" Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.486531 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.900459 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hgbt" event={"ID":"22651d28-167e-40b7-91d2-d8b9e5426ce5","Type":"ContainerStarted","Data":"6763faf6b9eac0e24df063bb63b6d01bc6c7352b47e5fe720702494688b7e78d"} Dec 03 22:18:39 crc kubenswrapper[4830]: I1203 22:18:39.969338 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 03 22:18:40 crc kubenswrapper[4830]: I1203 22:18:40.908409 4830 generic.go:334] "Generic (PLEG): container finished" podID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerID="6763faf6b9eac0e24df063bb63b6d01bc6c7352b47e5fe720702494688b7e78d" exitCode=0 Dec 03 22:18:40 crc kubenswrapper[4830]: I1203 22:18:40.908617 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hgbt" event={"ID":"22651d28-167e-40b7-91d2-d8b9e5426ce5","Type":"ContainerDied","Data":"6763faf6b9eac0e24df063bb63b6d01bc6c7352b47e5fe720702494688b7e78d"} Dec 03 22:18:40 crc kubenswrapper[4830]: I1203 22:18:40.909983 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5","Type":"ContainerStarted","Data":"7e2b978caf0636189cd4c50eb23d4250bb3fdf7036ec086cf8f3913862248c26"} Dec 03 22:18:43 crc kubenswrapper[4830]: I1203 22:18:43.930275 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"10f55df2-2d91-4b68-9fcf-2ae8cbf7b1c5","Type":"ContainerStarted","Data":"a3a9ae49e80a8b3af1a97db35079ebd2d17dba0339a93ec5c0165497dda0d8c0"} Dec 03 22:18:43 crc kubenswrapper[4830]: I1203 22:18:43.935880 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9hgbt" event={"ID":"22651d28-167e-40b7-91d2-d8b9e5426ce5","Type":"ContainerStarted","Data":"ea999ac873909cf3556b33f0542b75736f85d6f6d1cb26b8833ad9cb2f52c458"} Dec 03 22:18:43 crc kubenswrapper[4830]: I1203 22:18:43.955461 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.718744776 podStartE2EDuration="7.955422579s" podCreationTimestamp="2025-12-03 22:18:36 +0000 UTC" firstStartedPulling="2025-12-03 22:18:39.983848353 +0000 UTC m=+808.980309702" lastFinishedPulling="2025-12-03 22:18:43.220526156 +0000 UTC m=+812.216987505" observedRunningTime="2025-12-03 22:18:43.945540797 +0000 UTC m=+812.942002206" watchObservedRunningTime="2025-12-03 22:18:43.955422579 +0000 UTC m=+812.951883968" Dec 03 22:18:47 crc kubenswrapper[4830]: I1203 22:18:47.957081 4830 generic.go:334] "Generic (PLEG): container finished" podID="b3a9c045-3aac-47e1-ae39-645d33985f37" containerID="2b0ad56584356739213dee2991e788288568a97e52835479ecd13cdc1e602812" exitCode=0 Dec 03 22:18:47 crc kubenswrapper[4830]: I1203 22:18:47.957148 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" event={"ID":"b3a9c045-3aac-47e1-ae39-645d33985f37","Type":"ContainerDied","Data":"2b0ad56584356739213dee2991e788288568a97e52835479ecd13cdc1e602812"} Dec 03 22:18:47 crc kubenswrapper[4830]: I1203 22:18:47.981089 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9hgbt" podStartSLOduration=5.659708203 podStartE2EDuration="9.981060312s" podCreationTimestamp="2025-12-03 22:18:38 +0000 UTC" firstStartedPulling="2025-12-03 22:18:38.88257282 +0000 UTC m=+807.879034169" lastFinishedPulling="2025-12-03 22:18:43.203924919 +0000 UTC m=+812.200386278" observedRunningTime="2025-12-03 22:18:43.984650743 +0000 UTC m=+812.981112122" watchObservedRunningTime="2025-12-03 
22:18:47.981060312 +0000 UTC m=+816.977521711" Dec 03 22:18:48 crc kubenswrapper[4830]: I1203 22:18:48.477764 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:48 crc kubenswrapper[4830]: I1203 22:18:48.477809 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:48 crc kubenswrapper[4830]: I1203 22:18:48.964681 4830 generic.go:334] "Generic (PLEG): container finished" podID="b3a9c045-3aac-47e1-ae39-645d33985f37" containerID="0fb0b0d1a1eea2d75167a2e3a58f1d6cc16cbebb9db5cb8c8190d4fd5f984174" exitCode=0 Dec 03 22:18:48 crc kubenswrapper[4830]: I1203 22:18:48.964722 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" event={"ID":"b3a9c045-3aac-47e1-ae39-645d33985f37","Type":"ContainerDied","Data":"0fb0b0d1a1eea2d75167a2e3a58f1d6cc16cbebb9db5cb8c8190d4fd5f984174"} Dec 03 22:18:49 crc kubenswrapper[4830]: I1203 22:18:49.515981 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9hgbt" podUID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerName="registry-server" probeResult="failure" output=< Dec 03 22:18:49 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Dec 03 22:18:49 crc kubenswrapper[4830]: > Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.328276 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.420464 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-util\") pod \"b3a9c045-3aac-47e1-ae39-645d33985f37\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.420595 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-bundle\") pod \"b3a9c045-3aac-47e1-ae39-645d33985f37\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.420625 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x9rg\" (UniqueName: \"kubernetes.io/projected/b3a9c045-3aac-47e1-ae39-645d33985f37-kube-api-access-5x9rg\") pod \"b3a9c045-3aac-47e1-ae39-645d33985f37\" (UID: \"b3a9c045-3aac-47e1-ae39-645d33985f37\") " Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.423493 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-bundle" (OuterVolumeSpecName: "bundle") pod "b3a9c045-3aac-47e1-ae39-645d33985f37" (UID: "b3a9c045-3aac-47e1-ae39-645d33985f37"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.428754 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a9c045-3aac-47e1-ae39-645d33985f37-kube-api-access-5x9rg" (OuterVolumeSpecName: "kube-api-access-5x9rg") pod "b3a9c045-3aac-47e1-ae39-645d33985f37" (UID: "b3a9c045-3aac-47e1-ae39-645d33985f37"). InnerVolumeSpecName "kube-api-access-5x9rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.435954 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-util" (OuterVolumeSpecName: "util") pod "b3a9c045-3aac-47e1-ae39-645d33985f37" (UID: "b3a9c045-3aac-47e1-ae39-645d33985f37"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.523992 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-util\") on node \"crc\" DevicePath \"\"" Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.524056 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3a9c045-3aac-47e1-ae39-645d33985f37-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.524081 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x9rg\" (UniqueName: \"kubernetes.io/projected/b3a9c045-3aac-47e1-ae39-645d33985f37-kube-api-access-5x9rg\") on node \"crc\" DevicePath \"\"" Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.981292 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" event={"ID":"b3a9c045-3aac-47e1-ae39-645d33985f37","Type":"ContainerDied","Data":"22ff2e86cbe8c371c5778828c14daca00332d81669dad3c1b5dccbc81ced7879"} Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.981348 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5" Dec 03 22:18:50 crc kubenswrapper[4830]: I1203 22:18:50.981360 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22ff2e86cbe8c371c5778828c14daca00332d81669dad3c1b5dccbc81ced7879" Dec 03 22:18:56 crc kubenswrapper[4830]: I1203 22:18:56.681464 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:18:56 crc kubenswrapper[4830]: I1203 22:18:56.681925 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:18:56 crc kubenswrapper[4830]: I1203 22:18:56.681990 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:18:56 crc kubenswrapper[4830]: I1203 22:18:56.682863 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"171c35f7222805b1cd5f3e37402f9fb3e95a2ab3a3d5d02e3ef17eeb4f38ce9b"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 22:18:56 crc kubenswrapper[4830]: I1203 22:18:56.682945 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" 
containerName="machine-config-daemon" containerID="cri-o://171c35f7222805b1cd5f3e37402f9fb3e95a2ab3a3d5d02e3ef17eeb4f38ce9b" gracePeriod=600 Dec 03 22:18:58 crc kubenswrapper[4830]: I1203 22:18:58.524168 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:58 crc kubenswrapper[4830]: I1203 22:18:58.566187 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.049466 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="171c35f7222805b1cd5f3e37402f9fb3e95a2ab3a3d5d02e3ef17eeb4f38ce9b" exitCode=0 Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.050162 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"171c35f7222805b1cd5f3e37402f9fb3e95a2ab3a3d5d02e3ef17eeb4f38ce9b"} Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.050207 4830 scope.go:117] "RemoveContainer" containerID="da3366248b70067b6bfe62a9e9986089d023f74aa981d5bd06326f97c22262eb" Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.788962 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd"] Dec 03 22:18:59 crc kubenswrapper[4830]: E1203 22:18:59.789529 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a9c045-3aac-47e1-ae39-645d33985f37" containerName="util" Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.789546 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a9c045-3aac-47e1-ae39-645d33985f37" containerName="util" Dec 03 22:18:59 crc kubenswrapper[4830]: E1203 22:18:59.789560 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b3a9c045-3aac-47e1-ae39-645d33985f37" containerName="extract" Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.789567 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a9c045-3aac-47e1-ae39-645d33985f37" containerName="extract" Dec 03 22:18:59 crc kubenswrapper[4830]: E1203 22:18:59.789591 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a9c045-3aac-47e1-ae39-645d33985f37" containerName="pull" Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.789599 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a9c045-3aac-47e1-ae39-645d33985f37" containerName="pull" Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.789710 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a9c045-3aac-47e1-ae39-645d33985f37" containerName="extract" Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.790566 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.792484 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.803000 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd"] Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.950549 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.950637 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q49sg\" (UniqueName: \"kubernetes.io/projected/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-kube-api-access-q49sg\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:18:59 crc kubenswrapper[4830]: I1203 22:18:59.950664 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:19:00 crc kubenswrapper[4830]: I1203 22:19:00.051693 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q49sg\" (UniqueName: \"kubernetes.io/projected/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-kube-api-access-q49sg\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:19:00 crc kubenswrapper[4830]: I1203 22:19:00.051746 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:19:00 crc kubenswrapper[4830]: I1203 22:19:00.051814 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:19:00 crc kubenswrapper[4830]: I1203 22:19:00.052434 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:19:00 crc kubenswrapper[4830]: I1203 22:19:00.052534 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:19:00 crc kubenswrapper[4830]: I1203 22:19:00.055824 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"942bb799e68a31057e858be496e721bb353443055eb7deda485ab32976586b59"} Dec 03 22:19:00 crc kubenswrapper[4830]: I1203 22:19:00.076570 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q49sg\" (UniqueName: \"kubernetes.io/projected/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-kube-api-access-q49sg\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " 
pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:19:00 crc kubenswrapper[4830]: I1203 22:19:00.107434 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:19:00 crc kubenswrapper[4830]: I1203 22:19:00.297910 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd"] Dec 03 22:19:00 crc kubenswrapper[4830]: W1203 22:19:00.298961 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba6a7d25_7c4b_4587_bfbe_8197f7be7eed.slice/crio-45e4892e5d1418bffb0d4008351291b88b463cb07c98d46a6f7a553bf3c5fab0 WatchSource:0}: Error finding container 45e4892e5d1418bffb0d4008351291b88b463cb07c98d46a6f7a553bf3c5fab0: Status 404 returned error can't find the container with id 45e4892e5d1418bffb0d4008351291b88b463cb07c98d46a6f7a553bf3c5fab0 Dec 03 22:19:00 crc kubenswrapper[4830]: I1203 22:19:00.704174 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9hgbt"] Dec 03 22:19:00 crc kubenswrapper[4830]: I1203 22:19:00.704714 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9hgbt" podUID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerName="registry-server" containerID="cri-o://ea999ac873909cf3556b33f0542b75736f85d6f6d1cb26b8833ad9cb2f52c458" gracePeriod=2 Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.063098 4830 generic.go:334] "Generic (PLEG): container finished" podID="ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" containerID="e65d070a06fef975da25ab135c88be5b7d4891d56d78afa73dbf250026b36b37" exitCode=0 Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.063184 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" event={"ID":"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed","Type":"ContainerDied","Data":"e65d070a06fef975da25ab135c88be5b7d4891d56d78afa73dbf250026b36b37"} Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.063213 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" event={"ID":"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed","Type":"ContainerStarted","Data":"45e4892e5d1418bffb0d4008351291b88b463cb07c98d46a6f7a553bf3c5fab0"} Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.065092 4830 generic.go:334] "Generic (PLEG): container finished" podID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerID="ea999ac873909cf3556b33f0542b75736f85d6f6d1cb26b8833ad9cb2f52c458" exitCode=0 Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.065162 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hgbt" event={"ID":"22651d28-167e-40b7-91d2-d8b9e5426ce5","Type":"ContainerDied","Data":"ea999ac873909cf3556b33f0542b75736f85d6f6d1cb26b8833ad9cb2f52c458"} Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.118979 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.267195 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-catalog-content\") pod \"22651d28-167e-40b7-91d2-d8b9e5426ce5\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.267264 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-utilities\") pod \"22651d28-167e-40b7-91d2-d8b9e5426ce5\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.267331 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzh2d\" (UniqueName: \"kubernetes.io/projected/22651d28-167e-40b7-91d2-d8b9e5426ce5-kube-api-access-tzh2d\") pod \"22651d28-167e-40b7-91d2-d8b9e5426ce5\" (UID: \"22651d28-167e-40b7-91d2-d8b9e5426ce5\") " Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.268571 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-utilities" (OuterVolumeSpecName: "utilities") pod "22651d28-167e-40b7-91d2-d8b9e5426ce5" (UID: "22651d28-167e-40b7-91d2-d8b9e5426ce5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.273742 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22651d28-167e-40b7-91d2-d8b9e5426ce5-kube-api-access-tzh2d" (OuterVolumeSpecName: "kube-api-access-tzh2d") pod "22651d28-167e-40b7-91d2-d8b9e5426ce5" (UID: "22651d28-167e-40b7-91d2-d8b9e5426ce5"). InnerVolumeSpecName "kube-api-access-tzh2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.369175 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.369204 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzh2d\" (UniqueName: \"kubernetes.io/projected/22651d28-167e-40b7-91d2-d8b9e5426ce5-kube-api-access-tzh2d\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.382660 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22651d28-167e-40b7-91d2-d8b9e5426ce5" (UID: "22651d28-167e-40b7-91d2-d8b9e5426ce5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.471120 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22651d28-167e-40b7-91d2-d8b9e5426ce5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.492189 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj"] Dec 03 22:19:01 crc kubenswrapper[4830]: E1203 22:19:01.492489 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerName="registry-server" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.492526 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerName="registry-server" Dec 03 22:19:01 crc kubenswrapper[4830]: E1203 22:19:01.492540 4830 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerName="extract-utilities" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.492548 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerName="extract-utilities" Dec 03 22:19:01 crc kubenswrapper[4830]: E1203 22:19:01.492559 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerName="extract-content" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.492566 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerName="extract-content" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.492708 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="22651d28-167e-40b7-91d2-d8b9e5426ce5" containerName="registry-server" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.493354 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.496230 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.496308 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.496450 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.496638 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.496747 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-f9c79" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.496827 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.547150 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj"] Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.575386 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/aac5b4ce-2952-49d3-81bc-4ace758e5367-manager-config\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 
22:19:01.575471 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b79js\" (UniqueName: \"kubernetes.io/projected/aac5b4ce-2952-49d3-81bc-4ace758e5367-kube-api-access-b79js\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.575502 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aac5b4ce-2952-49d3-81bc-4ace758e5367-apiservice-cert\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.575543 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aac5b4ce-2952-49d3-81bc-4ace758e5367-webhook-cert\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.575614 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aac5b4ce-2952-49d3-81bc-4ace758e5367-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.676703 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"manager-config\" (UniqueName: \"kubernetes.io/configmap/aac5b4ce-2952-49d3-81bc-4ace758e5367-manager-config\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.676809 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b79js\" (UniqueName: \"kubernetes.io/projected/aac5b4ce-2952-49d3-81bc-4ace758e5367-kube-api-access-b79js\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.676834 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aac5b4ce-2952-49d3-81bc-4ace758e5367-apiservice-cert\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.676857 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aac5b4ce-2952-49d3-81bc-4ace758e5367-webhook-cert\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.676888 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aac5b4ce-2952-49d3-81bc-4ace758e5367-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: 
\"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.677543 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/aac5b4ce-2952-49d3-81bc-4ace758e5367-manager-config\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.683811 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aac5b4ce-2952-49d3-81bc-4ace758e5367-apiservice-cert\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.684759 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aac5b4ce-2952-49d3-81bc-4ace758e5367-webhook-cert\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.696981 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aac5b4ce-2952-49d3-81bc-4ace758e5367-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.704413 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b79js\" (UniqueName: \"kubernetes.io/projected/aac5b4ce-2952-49d3-81bc-4ace758e5367-kube-api-access-b79js\") pod \"loki-operator-controller-manager-7dcf7ddf84-vj9kj\" (UID: \"aac5b4ce-2952-49d3-81bc-4ace758e5367\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.807674 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:01 crc kubenswrapper[4830]: I1203 22:19:01.995554 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj"] Dec 03 22:19:02 crc kubenswrapper[4830]: I1203 22:19:02.071939 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hgbt" event={"ID":"22651d28-167e-40b7-91d2-d8b9e5426ce5","Type":"ContainerDied","Data":"7ff0ecdb953844b7744692dc6478e41e96f9825ae335e8072a2bce2bb8f21362"} Dec 03 22:19:02 crc kubenswrapper[4830]: I1203 22:19:02.071979 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9hgbt" Dec 03 22:19:02 crc kubenswrapper[4830]: I1203 22:19:02.071999 4830 scope.go:117] "RemoveContainer" containerID="ea999ac873909cf3556b33f0542b75736f85d6f6d1cb26b8833ad9cb2f52c458" Dec 03 22:19:02 crc kubenswrapper[4830]: I1203 22:19:02.096825 4830 scope.go:117] "RemoveContainer" containerID="6763faf6b9eac0e24df063bb63b6d01bc6c7352b47e5fe720702494688b7e78d" Dec 03 22:19:02 crc kubenswrapper[4830]: I1203 22:19:02.108399 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9hgbt"] Dec 03 22:19:02 crc kubenswrapper[4830]: I1203 22:19:02.113639 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9hgbt"] Dec 03 22:19:02 crc kubenswrapper[4830]: I1203 22:19:02.119335 4830 scope.go:117] "RemoveContainer" containerID="7cf765524eeac6b89d35a30da28b51d77bc5c2ccd2a9659b37515972d43cd590" Dec 03 22:19:03 crc kubenswrapper[4830]: I1203 22:19:03.080298 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" event={"ID":"aac5b4ce-2952-49d3-81bc-4ace758e5367","Type":"ContainerStarted","Data":"3b4b94a924204af5342464e3398bc4df46c39ffd00d5dcc13fb15e3a92163a94"} Dec 03 22:19:03 crc kubenswrapper[4830]: I1203 22:19:03.082396 4830 generic.go:334] "Generic (PLEG): container finished" podID="ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" containerID="ff05272c11923e58d727e3b2b8c45464c56dc51afe270451a2b2ed3928265ef2" exitCode=0 Dec 03 22:19:03 crc kubenswrapper[4830]: I1203 22:19:03.082483 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" event={"ID":"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed","Type":"ContainerDied","Data":"ff05272c11923e58d727e3b2b8c45464c56dc51afe270451a2b2ed3928265ef2"} Dec 03 22:19:03 crc kubenswrapper[4830]: I1203 22:19:03.346610 4830 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22651d28-167e-40b7-91d2-d8b9e5426ce5" path="/var/lib/kubelet/pods/22651d28-167e-40b7-91d2-d8b9e5426ce5/volumes" Dec 03 22:19:04 crc kubenswrapper[4830]: I1203 22:19:04.092803 4830 generic.go:334] "Generic (PLEG): container finished" podID="ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" containerID="d19d7364d66a2e282370f92ae3348b73a38cfbbec256fb380b7cd7434f2e3990" exitCode=0 Dec 03 22:19:04 crc kubenswrapper[4830]: I1203 22:19:04.092845 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" event={"ID":"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed","Type":"ContainerDied","Data":"d19d7364d66a2e282370f92ae3348b73a38cfbbec256fb380b7cd7434f2e3990"} Dec 03 22:19:05 crc kubenswrapper[4830]: I1203 22:19:05.749317 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:19:05 crc kubenswrapper[4830]: I1203 22:19:05.839567 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q49sg\" (UniqueName: \"kubernetes.io/projected/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-kube-api-access-q49sg\") pod \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " Dec 03 22:19:05 crc kubenswrapper[4830]: I1203 22:19:05.839641 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-bundle\") pod \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " Dec 03 22:19:05 crc kubenswrapper[4830]: I1203 22:19:05.839664 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-util\") pod 
\"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\" (UID: \"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed\") " Dec 03 22:19:05 crc kubenswrapper[4830]: I1203 22:19:05.840991 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-bundle" (OuterVolumeSpecName: "bundle") pod "ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" (UID: "ba6a7d25-7c4b-4587-bfbe-8197f7be7eed"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:19:05 crc kubenswrapper[4830]: I1203 22:19:05.843707 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-kube-api-access-q49sg" (OuterVolumeSpecName: "kube-api-access-q49sg") pod "ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" (UID: "ba6a7d25-7c4b-4587-bfbe-8197f7be7eed"). InnerVolumeSpecName "kube-api-access-q49sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:19:05 crc kubenswrapper[4830]: I1203 22:19:05.853809 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-util" (OuterVolumeSpecName: "util") pod "ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" (UID: "ba6a7d25-7c4b-4587-bfbe-8197f7be7eed"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:19:05 crc kubenswrapper[4830]: I1203 22:19:05.941134 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q49sg\" (UniqueName: \"kubernetes.io/projected/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-kube-api-access-q49sg\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:05 crc kubenswrapper[4830]: I1203 22:19:05.941165 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:05 crc kubenswrapper[4830]: I1203 22:19:05.941174 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba6a7d25-7c4b-4587-bfbe-8197f7be7eed-util\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:06 crc kubenswrapper[4830]: I1203 22:19:06.113436 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" event={"ID":"aac5b4ce-2952-49d3-81bc-4ace758e5367","Type":"ContainerStarted","Data":"12b55423b125b5b2290cac13fd46c03c4bef8168a5f319c820e662bc5816569a"} Dec 03 22:19:06 crc kubenswrapper[4830]: I1203 22:19:06.116047 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" event={"ID":"ba6a7d25-7c4b-4587-bfbe-8197f7be7eed","Type":"ContainerDied","Data":"45e4892e5d1418bffb0d4008351291b88b463cb07c98d46a6f7a553bf3c5fab0"} Dec 03 22:19:06 crc kubenswrapper[4830]: I1203 22:19:06.116089 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45e4892e5d1418bffb0d4008351291b88b463cb07c98d46a6f7a553bf3c5fab0" Dec 03 22:19:06 crc kubenswrapper[4830]: I1203 22:19:06.116099 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd" Dec 03 22:19:12 crc kubenswrapper[4830]: I1203 22:19:12.165739 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" event={"ID":"aac5b4ce-2952-49d3-81bc-4ace758e5367","Type":"ContainerStarted","Data":"ca83fbdbfab1f77600c3b9c0fbf94da93dff90e5f390b928902cf5fea18884c7"} Dec 03 22:19:12 crc kubenswrapper[4830]: I1203 22:19:12.166565 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:12 crc kubenswrapper[4830]: I1203 22:19:12.169354 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" Dec 03 22:19:12 crc kubenswrapper[4830]: I1203 22:19:12.204205 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-7dcf7ddf84-vj9kj" podStartSLOduration=1.8783676489999999 podStartE2EDuration="11.204184716s" podCreationTimestamp="2025-12-03 22:19:01 +0000 UTC" firstStartedPulling="2025-12-03 22:19:02.083989779 +0000 UTC m=+831.080451128" lastFinishedPulling="2025-12-03 22:19:11.409806816 +0000 UTC m=+840.406268195" observedRunningTime="2025-12-03 22:19:12.199144537 +0000 UTC m=+841.195605906" watchObservedRunningTime="2025-12-03 22:19:12.204184716 +0000 UTC m=+841.200646075" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.032048 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8m85s"] Dec 03 22:19:31 crc kubenswrapper[4830]: E1203 22:19:31.033128 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" containerName="extract" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 
22:19:31.033152 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" containerName="extract" Dec 03 22:19:31 crc kubenswrapper[4830]: E1203 22:19:31.033189 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" containerName="util" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.033204 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" containerName="util" Dec 03 22:19:31 crc kubenswrapper[4830]: E1203 22:19:31.033226 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" containerName="pull" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.033242 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" containerName="pull" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.033424 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba6a7d25-7c4b-4587-bfbe-8197f7be7eed" containerName="extract" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.034815 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.043948 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8m85s"] Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.233165 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-utilities\") pod \"community-operators-8m85s\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.233272 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-catalog-content\") pod \"community-operators-8m85s\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.233304 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974bk\" (UniqueName: \"kubernetes.io/projected/fe45682b-25f6-43f2-a278-6170aaa84405-kube-api-access-974bk\") pod \"community-operators-8m85s\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.334288 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-utilities\") pod \"community-operators-8m85s\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.334362 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-catalog-content\") pod \"community-operators-8m85s\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.334383 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974bk\" (UniqueName: \"kubernetes.io/projected/fe45682b-25f6-43f2-a278-6170aaa84405-kube-api-access-974bk\") pod \"community-operators-8m85s\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.335085 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-utilities\") pod \"community-operators-8m85s\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.335268 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-catalog-content\") pod \"community-operators-8m85s\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.361581 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974bk\" (UniqueName: \"kubernetes.io/projected/fe45682b-25f6-43f2-a278-6170aaa84405-kube-api-access-974bk\") pod \"community-operators-8m85s\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.365208 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:31 crc kubenswrapper[4830]: I1203 22:19:31.700321 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8m85s"] Dec 03 22:19:32 crc kubenswrapper[4830]: I1203 22:19:32.308795 4830 generic.go:334] "Generic (PLEG): container finished" podID="fe45682b-25f6-43f2-a278-6170aaa84405" containerID="085ccb4a5668b331a97f8f698d0236674fcfde936e11fe9c2e550eef79462b4c" exitCode=0 Dec 03 22:19:32 crc kubenswrapper[4830]: I1203 22:19:32.308890 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8m85s" event={"ID":"fe45682b-25f6-43f2-a278-6170aaa84405","Type":"ContainerDied","Data":"085ccb4a5668b331a97f8f698d0236674fcfde936e11fe9c2e550eef79462b4c"} Dec 03 22:19:32 crc kubenswrapper[4830]: I1203 22:19:32.309094 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8m85s" event={"ID":"fe45682b-25f6-43f2-a278-6170aaa84405","Type":"ContainerStarted","Data":"77853e77e0c18bc8a823d6c38819d20d35571aedb187aebfa7397e958bd27ee3"} Dec 03 22:19:35 crc kubenswrapper[4830]: I1203 22:19:34.329943 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8m85s" event={"ID":"fe45682b-25f6-43f2-a278-6170aaa84405","Type":"ContainerStarted","Data":"2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f"} Dec 03 22:19:35 crc kubenswrapper[4830]: I1203 22:19:35.343756 4830 generic.go:334] "Generic (PLEG): container finished" podID="fe45682b-25f6-43f2-a278-6170aaa84405" containerID="2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f" exitCode=0 Dec 03 22:19:35 crc kubenswrapper[4830]: I1203 22:19:35.356307 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8m85s" 
event={"ID":"fe45682b-25f6-43f2-a278-6170aaa84405","Type":"ContainerDied","Data":"2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f"} Dec 03 22:19:36 crc kubenswrapper[4830]: I1203 22:19:36.354275 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8m85s" event={"ID":"fe45682b-25f6-43f2-a278-6170aaa84405","Type":"ContainerStarted","Data":"56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28"} Dec 03 22:19:36 crc kubenswrapper[4830]: I1203 22:19:36.379177 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8m85s" podStartSLOduration=1.932233087 podStartE2EDuration="5.379154505s" podCreationTimestamp="2025-12-03 22:19:31 +0000 UTC" firstStartedPulling="2025-12-03 22:19:32.310993032 +0000 UTC m=+861.307454391" lastFinishedPulling="2025-12-03 22:19:35.75791445 +0000 UTC m=+864.754375809" observedRunningTime="2025-12-03 22:19:36.37531048 +0000 UTC m=+865.371771839" watchObservedRunningTime="2025-12-03 22:19:36.379154505 +0000 UTC m=+865.375615864" Dec 03 22:19:41 crc kubenswrapper[4830]: I1203 22:19:41.366712 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:41 crc kubenswrapper[4830]: I1203 22:19:41.368612 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:41 crc kubenswrapper[4830]: I1203 22:19:41.429216 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:42 crc kubenswrapper[4830]: I1203 22:19:42.470978 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:42 crc kubenswrapper[4830]: I1203 22:19:42.529950 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8m85s"] Dec 03 22:19:44 crc kubenswrapper[4830]: I1203 22:19:44.410189 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8m85s" podUID="fe45682b-25f6-43f2-a278-6170aaa84405" containerName="registry-server" containerID="cri-o://56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28" gracePeriod=2 Dec 03 22:19:44 crc kubenswrapper[4830]: I1203 22:19:44.826816 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:44 crc kubenswrapper[4830]: I1203 22:19:44.943094 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-catalog-content\") pod \"fe45682b-25f6-43f2-a278-6170aaa84405\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " Dec 03 22:19:44 crc kubenswrapper[4830]: I1203 22:19:44.943233 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-utilities\") pod \"fe45682b-25f6-43f2-a278-6170aaa84405\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " Dec 03 22:19:44 crc kubenswrapper[4830]: I1203 22:19:44.943336 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-974bk\" (UniqueName: \"kubernetes.io/projected/fe45682b-25f6-43f2-a278-6170aaa84405-kube-api-access-974bk\") pod \"fe45682b-25f6-43f2-a278-6170aaa84405\" (UID: \"fe45682b-25f6-43f2-a278-6170aaa84405\") " Dec 03 22:19:44 crc kubenswrapper[4830]: I1203 22:19:44.944792 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-utilities" (OuterVolumeSpecName: "utilities") pod "fe45682b-25f6-43f2-a278-6170aaa84405" (UID: 
"fe45682b-25f6-43f2-a278-6170aaa84405"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:19:44 crc kubenswrapper[4830]: I1203 22:19:44.951854 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe45682b-25f6-43f2-a278-6170aaa84405-kube-api-access-974bk" (OuterVolumeSpecName: "kube-api-access-974bk") pod "fe45682b-25f6-43f2-a278-6170aaa84405" (UID: "fe45682b-25f6-43f2-a278-6170aaa84405"). InnerVolumeSpecName "kube-api-access-974bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.045610 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-974bk\" (UniqueName: \"kubernetes.io/projected/fe45682b-25f6-43f2-a278-6170aaa84405-kube-api-access-974bk\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.045665 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.128403 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe45682b-25f6-43f2-a278-6170aaa84405" (UID: "fe45682b-25f6-43f2-a278-6170aaa84405"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.146577 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe45682b-25f6-43f2-a278-6170aaa84405-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.419211 4830 generic.go:334] "Generic (PLEG): container finished" podID="fe45682b-25f6-43f2-a278-6170aaa84405" containerID="56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28" exitCode=0 Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.419271 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8m85s" event={"ID":"fe45682b-25f6-43f2-a278-6170aaa84405","Type":"ContainerDied","Data":"56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28"} Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.419307 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8m85s" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.419333 4830 scope.go:117] "RemoveContainer" containerID="56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.419317 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8m85s" event={"ID":"fe45682b-25f6-43f2-a278-6170aaa84405","Type":"ContainerDied","Data":"77853e77e0c18bc8a823d6c38819d20d35571aedb187aebfa7397e958bd27ee3"} Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.452578 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8m85s"] Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.453962 4830 scope.go:117] "RemoveContainer" containerID="2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.466244 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8m85s"] Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.495809 4830 scope.go:117] "RemoveContainer" containerID="085ccb4a5668b331a97f8f698d0236674fcfde936e11fe9c2e550eef79462b4c" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.521212 4830 scope.go:117] "RemoveContainer" containerID="56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28" Dec 03 22:19:45 crc kubenswrapper[4830]: E1203 22:19:45.521892 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28\": container with ID starting with 56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28 not found: ID does not exist" containerID="56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.522033 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28"} err="failed to get container status \"56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28\": rpc error: code = NotFound desc = could not find container \"56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28\": container with ID starting with 56cf2df078ef0180c4aadcf888634e5bde9c804da7a2fc8da17bd80028703e28 not found: ID does not exist" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.522148 4830 scope.go:117] "RemoveContainer" containerID="2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f" Dec 03 22:19:45 crc kubenswrapper[4830]: E1203 22:19:45.522640 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f\": container with ID starting with 2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f not found: ID does not exist" containerID="2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.522693 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f"} err="failed to get container status \"2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f\": rpc error: code = NotFound desc = could not find container \"2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f\": container with ID starting with 2ef248eab7e7d492f3f6fb9de10e123fd978ff658713a598966e519ad436c83f not found: ID does not exist" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.522732 4830 scope.go:117] "RemoveContainer" containerID="085ccb4a5668b331a97f8f698d0236674fcfde936e11fe9c2e550eef79462b4c" Dec 03 22:19:45 crc kubenswrapper[4830]: E1203 
22:19:45.523604 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"085ccb4a5668b331a97f8f698d0236674fcfde936e11fe9c2e550eef79462b4c\": container with ID starting with 085ccb4a5668b331a97f8f698d0236674fcfde936e11fe9c2e550eef79462b4c not found: ID does not exist" containerID="085ccb4a5668b331a97f8f698d0236674fcfde936e11fe9c2e550eef79462b4c" Dec 03 22:19:45 crc kubenswrapper[4830]: I1203 22:19:45.523651 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"085ccb4a5668b331a97f8f698d0236674fcfde936e11fe9c2e550eef79462b4c"} err="failed to get container status \"085ccb4a5668b331a97f8f698d0236674fcfde936e11fe9c2e550eef79462b4c\": rpc error: code = NotFound desc = could not find container \"085ccb4a5668b331a97f8f698d0236674fcfde936e11fe9c2e550eef79462b4c\": container with ID starting with 085ccb4a5668b331a97f8f698d0236674fcfde936e11fe9c2e550eef79462b4c not found: ID does not exist" Dec 03 22:19:47 crc kubenswrapper[4830]: I1203 22:19:47.352006 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe45682b-25f6-43f2-a278-6170aaa84405" path="/var/lib/kubelet/pods/fe45682b-25f6-43f2-a278-6170aaa84405/volumes" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.094585 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4"] Dec 03 22:19:49 crc kubenswrapper[4830]: E1203 22:19:49.095199 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe45682b-25f6-43f2-a278-6170aaa84405" containerName="extract-utilities" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.095214 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe45682b-25f6-43f2-a278-6170aaa84405" containerName="extract-utilities" Dec 03 22:19:49 crc kubenswrapper[4830]: E1203 22:19:49.095237 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fe45682b-25f6-43f2-a278-6170aaa84405" containerName="extract-content" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.095245 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe45682b-25f6-43f2-a278-6170aaa84405" containerName="extract-content" Dec 03 22:19:49 crc kubenswrapper[4830]: E1203 22:19:49.095258 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe45682b-25f6-43f2-a278-6170aaa84405" containerName="registry-server" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.095266 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe45682b-25f6-43f2-a278-6170aaa84405" containerName="registry-server" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.095375 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe45682b-25f6-43f2-a278-6170aaa84405" containerName="registry-server" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.096354 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.107398 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4"] Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.149322 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.202040 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:49 crc 
kubenswrapper[4830]: I1203 22:19:49.202102 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.202157 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hq75\" (UniqueName: \"kubernetes.io/projected/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-kube-api-access-2hq75\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.303005 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.303073 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.303124 4830 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2hq75\" (UniqueName: \"kubernetes.io/projected/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-kube-api-access-2hq75\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.303549 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.303619 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.325970 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hq75\" (UniqueName: \"kubernetes.io/projected/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-kube-api-access-2hq75\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.455955 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:49 crc kubenswrapper[4830]: I1203 22:19:49.885883 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4"] Dec 03 22:19:50 crc kubenswrapper[4830]: I1203 22:19:50.449591 4830 generic.go:334] "Generic (PLEG): container finished" podID="dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" containerID="2f5cc4dc70aec3cdc37000fa1c86208dcb8ed8e1cc0a18e284c490255382db93" exitCode=0 Dec 03 22:19:50 crc kubenswrapper[4830]: I1203 22:19:50.449831 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" event={"ID":"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1","Type":"ContainerDied","Data":"2f5cc4dc70aec3cdc37000fa1c86208dcb8ed8e1cc0a18e284c490255382db93"} Dec 03 22:19:50 crc kubenswrapper[4830]: I1203 22:19:50.449887 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" event={"ID":"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1","Type":"ContainerStarted","Data":"8e95ab241c5374af44972b6fd131122ba6cabb27d64261ba136edc94109a5236"} Dec 03 22:19:52 crc kubenswrapper[4830]: I1203 22:19:52.467076 4830 generic.go:334] "Generic (PLEG): container finished" podID="dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" containerID="9fefdd240b59b67918615d8af477c1315da0e34a06beb86805e33a3d8a0a21ab" exitCode=0 Dec 03 22:19:52 crc kubenswrapper[4830]: I1203 22:19:52.467184 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" event={"ID":"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1","Type":"ContainerDied","Data":"9fefdd240b59b67918615d8af477c1315da0e34a06beb86805e33a3d8a0a21ab"} Dec 03 22:19:53 crc kubenswrapper[4830]: I1203 22:19:53.479815 4830 
generic.go:334] "Generic (PLEG): container finished" podID="dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" containerID="dd9161a805b335256cf1c0f91c1404bdd90d92b04b2ab9b1ec08c70671cb2712" exitCode=0 Dec 03 22:19:53 crc kubenswrapper[4830]: I1203 22:19:53.479876 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" event={"ID":"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1","Type":"ContainerDied","Data":"dd9161a805b335256cf1c0f91c1404bdd90d92b04b2ab9b1ec08c70671cb2712"} Dec 03 22:19:54 crc kubenswrapper[4830]: I1203 22:19:54.802087 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:54 crc kubenswrapper[4830]: I1203 22:19:54.878688 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-bundle\") pod \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " Dec 03 22:19:54 crc kubenswrapper[4830]: I1203 22:19:54.878807 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-util\") pod \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " Dec 03 22:19:54 crc kubenswrapper[4830]: I1203 22:19:54.878896 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hq75\" (UniqueName: \"kubernetes.io/projected/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-kube-api-access-2hq75\") pod \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\" (UID: \"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1\") " Dec 03 22:19:54 crc kubenswrapper[4830]: I1203 22:19:54.879815 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-bundle" (OuterVolumeSpecName: "bundle") pod "dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" (UID: "dd6b638e-c924-4e57-9e1f-ec4da2ef3db1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:19:54 crc kubenswrapper[4830]: I1203 22:19:54.890732 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-kube-api-access-2hq75" (OuterVolumeSpecName: "kube-api-access-2hq75") pod "dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" (UID: "dd6b638e-c924-4e57-9e1f-ec4da2ef3db1"). InnerVolumeSpecName "kube-api-access-2hq75". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:19:54 crc kubenswrapper[4830]: I1203 22:19:54.909003 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-util" (OuterVolumeSpecName: "util") pod "dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" (UID: "dd6b638e-c924-4e57-9e1f-ec4da2ef3db1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:19:54 crc kubenswrapper[4830]: I1203 22:19:54.980732 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-util\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:54 crc kubenswrapper[4830]: I1203 22:19:54.980797 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hq75\" (UniqueName: \"kubernetes.io/projected/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-kube-api-access-2hq75\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:54 crc kubenswrapper[4830]: I1203 22:19:54.980827 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd6b638e-c924-4e57-9e1f-ec4da2ef3db1-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:19:55 crc kubenswrapper[4830]: I1203 22:19:55.501548 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" event={"ID":"dd6b638e-c924-4e57-9e1f-ec4da2ef3db1","Type":"ContainerDied","Data":"8e95ab241c5374af44972b6fd131122ba6cabb27d64261ba136edc94109a5236"} Dec 03 22:19:55 crc kubenswrapper[4830]: I1203 22:19:55.501617 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e95ab241c5374af44972b6fd131122ba6cabb27d64261ba136edc94109a5236" Dec 03 22:19:55 crc kubenswrapper[4830]: I1203 22:19:55.502067 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.513440 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-pncxz"] Dec 03 22:19:58 crc kubenswrapper[4830]: E1203 22:19:58.514875 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" containerName="pull" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.514974 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" containerName="pull" Dec 03 22:19:58 crc kubenswrapper[4830]: E1203 22:19:58.515064 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" containerName="extract" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.515143 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" containerName="extract" Dec 03 22:19:58 crc kubenswrapper[4830]: E1203 22:19:58.515227 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" containerName="util" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.515301 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" containerName="util" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.515498 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6b638e-c924-4e57-9e1f-ec4da2ef3db1" containerName="extract" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.516090 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pncxz" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.518836 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-p98bw" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.519110 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.519431 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.531289 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-pncxz"] Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.633350 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lkbk\" (UniqueName: \"kubernetes.io/projected/de8f51ca-5084-4af1-87dc-715b869006d0-kube-api-access-5lkbk\") pod \"nmstate-operator-5b5b58f5c8-pncxz\" (UID: \"de8f51ca-5084-4af1-87dc-715b869006d0\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pncxz" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.735358 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lkbk\" (UniqueName: \"kubernetes.io/projected/de8f51ca-5084-4af1-87dc-715b869006d0-kube-api-access-5lkbk\") pod \"nmstate-operator-5b5b58f5c8-pncxz\" (UID: \"de8f51ca-5084-4af1-87dc-715b869006d0\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pncxz" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.756380 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lkbk\" (UniqueName: \"kubernetes.io/projected/de8f51ca-5084-4af1-87dc-715b869006d0-kube-api-access-5lkbk\") pod \"nmstate-operator-5b5b58f5c8-pncxz\" (UID: 
\"de8f51ca-5084-4af1-87dc-715b869006d0\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pncxz" Dec 03 22:19:58 crc kubenswrapper[4830]: I1203 22:19:58.834459 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pncxz" Dec 03 22:19:59 crc kubenswrapper[4830]: I1203 22:19:59.164623 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-pncxz"] Dec 03 22:19:59 crc kubenswrapper[4830]: I1203 22:19:59.532118 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pncxz" event={"ID":"de8f51ca-5084-4af1-87dc-715b869006d0","Type":"ContainerStarted","Data":"fc425221f53e77d63434b55eb441d7b54b0314143ad037a5e95ec1c8fce25fbc"} Dec 03 22:20:02 crc kubenswrapper[4830]: I1203 22:20:02.552482 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pncxz" event={"ID":"de8f51ca-5084-4af1-87dc-715b869006d0","Type":"ContainerStarted","Data":"9d5aa89b1c1cf59f9a27ea644419aca98b786595d5d6778d7bf471389f99f7fd"} Dec 03 22:20:02 crc kubenswrapper[4830]: I1203 22:20:02.598078 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pncxz" podStartSLOduration=2.001474668 podStartE2EDuration="4.598062478s" podCreationTimestamp="2025-12-03 22:19:58 +0000 UTC" firstStartedPulling="2025-12-03 22:19:59.175233363 +0000 UTC m=+888.171694712" lastFinishedPulling="2025-12-03 22:20:01.771821173 +0000 UTC m=+890.768282522" observedRunningTime="2025-12-03 22:20:02.596064963 +0000 UTC m=+891.592526322" watchObservedRunningTime="2025-12-03 22:20:02.598062478 +0000 UTC m=+891.594523827" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.676236 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc"] Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 
22:20:03.677890 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.680236 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.681065 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-szdq7" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.682099 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-krngn"] Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.682990 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-krngn" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.692109 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc"] Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.733830 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-krngn"] Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.740954 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-s66zt"] Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.741802 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.805440 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8b2e328-2397-45e1-a593-5f5094799015-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8pqtc\" (UID: \"a8b2e328-2397-45e1-a593-5f5094799015\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.805485 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/47b33d94-20d4-4640-acb2-c25aa2903bd1-nmstate-lock\") pod \"nmstate-handler-s66zt\" (UID: \"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.805538 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrcvr\" (UniqueName: \"kubernetes.io/projected/a8b2e328-2397-45e1-a593-5f5094799015-kube-api-access-qrcvr\") pod \"nmstate-webhook-5f6d4c5ccb-8pqtc\" (UID: \"a8b2e328-2397-45e1-a593-5f5094799015\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.805559 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/47b33d94-20d4-4640-acb2-c25aa2903bd1-dbus-socket\") pod \"nmstate-handler-s66zt\" (UID: \"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.805599 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/47b33d94-20d4-4640-acb2-c25aa2903bd1-ovs-socket\") pod 
\"nmstate-handler-s66zt\" (UID: \"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.805618 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjlbf\" (UniqueName: \"kubernetes.io/projected/22fd491b-eed1-4558-86bc-ae1f601fcdd0-kube-api-access-cjlbf\") pod \"nmstate-metrics-7f946cbc9-krngn\" (UID: \"22fd491b-eed1-4558-86bc-ae1f601fcdd0\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-krngn" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.805634 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hdpx\" (UniqueName: \"kubernetes.io/projected/47b33d94-20d4-4640-acb2-c25aa2903bd1-kube-api-access-7hdpx\") pod \"nmstate-handler-s66zt\" (UID: \"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.820238 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8"] Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.820876 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.824483 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.824867 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.825551 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-dcz7p" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.840638 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8"] Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.907206 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/603c63d8-e5d2-428b-925d-aab17f1889dc-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-8dnq8\" (UID: \"603c63d8-e5d2-428b-925d-aab17f1889dc\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.907253 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjlbf\" (UniqueName: \"kubernetes.io/projected/22fd491b-eed1-4558-86bc-ae1f601fcdd0-kube-api-access-cjlbf\") pod \"nmstate-metrics-7f946cbc9-krngn\" (UID: \"22fd491b-eed1-4558-86bc-ae1f601fcdd0\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-krngn" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.907274 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hdpx\" (UniqueName: \"kubernetes.io/projected/47b33d94-20d4-4640-acb2-c25aa2903bd1-kube-api-access-7hdpx\") pod \"nmstate-handler-s66zt\" (UID: \"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " 
pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.907293 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/603c63d8-e5d2-428b-925d-aab17f1889dc-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-8dnq8\" (UID: \"603c63d8-e5d2-428b-925d-aab17f1889dc\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.907329 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8b2e328-2397-45e1-a593-5f5094799015-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8pqtc\" (UID: \"a8b2e328-2397-45e1-a593-5f5094799015\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.907354 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/47b33d94-20d4-4640-acb2-c25aa2903bd1-nmstate-lock\") pod \"nmstate-handler-s66zt\" (UID: \"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.907391 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrcvr\" (UniqueName: \"kubernetes.io/projected/a8b2e328-2397-45e1-a593-5f5094799015-kube-api-access-qrcvr\") pod \"nmstate-webhook-5f6d4c5ccb-8pqtc\" (UID: \"a8b2e328-2397-45e1-a593-5f5094799015\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.907417 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/47b33d94-20d4-4640-acb2-c25aa2903bd1-dbus-socket\") pod \"nmstate-handler-s66zt\" (UID: 
\"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.907453 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph5mx\" (UniqueName: \"kubernetes.io/projected/603c63d8-e5d2-428b-925d-aab17f1889dc-kube-api-access-ph5mx\") pod \"nmstate-console-plugin-7fbb5f6569-8dnq8\" (UID: \"603c63d8-e5d2-428b-925d-aab17f1889dc\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.907491 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/47b33d94-20d4-4640-acb2-c25aa2903bd1-ovs-socket\") pod \"nmstate-handler-s66zt\" (UID: \"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.907563 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/47b33d94-20d4-4640-acb2-c25aa2903bd1-ovs-socket\") pod \"nmstate-handler-s66zt\" (UID: \"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.908410 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/47b33d94-20d4-4640-acb2-c25aa2903bd1-nmstate-lock\") pod \"nmstate-handler-s66zt\" (UID: \"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.908817 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/47b33d94-20d4-4640-acb2-c25aa2903bd1-dbus-socket\") pod \"nmstate-handler-s66zt\" (UID: \"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " 
pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.916059 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8b2e328-2397-45e1-a593-5f5094799015-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8pqtc\" (UID: \"a8b2e328-2397-45e1-a593-5f5094799015\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.924314 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hdpx\" (UniqueName: \"kubernetes.io/projected/47b33d94-20d4-4640-acb2-c25aa2903bd1-kube-api-access-7hdpx\") pod \"nmstate-handler-s66zt\" (UID: \"47b33d94-20d4-4640-acb2-c25aa2903bd1\") " pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.925703 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrcvr\" (UniqueName: \"kubernetes.io/projected/a8b2e328-2397-45e1-a593-5f5094799015-kube-api-access-qrcvr\") pod \"nmstate-webhook-5f6d4c5ccb-8pqtc\" (UID: \"a8b2e328-2397-45e1-a593-5f5094799015\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.934358 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjlbf\" (UniqueName: \"kubernetes.io/projected/22fd491b-eed1-4558-86bc-ae1f601fcdd0-kube-api-access-cjlbf\") pod \"nmstate-metrics-7f946cbc9-krngn\" (UID: \"22fd491b-eed1-4558-86bc-ae1f601fcdd0\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-krngn" Dec 03 22:20:03 crc kubenswrapper[4830]: I1203 22:20:03.999771 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-8966798f6-9txj7"] Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.000412 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.005426 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.009116 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/603c63d8-e5d2-428b-925d-aab17f1889dc-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-8dnq8\" (UID: \"603c63d8-e5d2-428b-925d-aab17f1889dc\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.009147 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/603c63d8-e5d2-428b-925d-aab17f1889dc-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-8dnq8\" (UID: \"603c63d8-e5d2-428b-925d-aab17f1889dc\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.009211 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph5mx\" (UniqueName: \"kubernetes.io/projected/603c63d8-e5d2-428b-925d-aab17f1889dc-kube-api-access-ph5mx\") pod \"nmstate-console-plugin-7fbb5f6569-8dnq8\" (UID: \"603c63d8-e5d2-428b-925d-aab17f1889dc\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.010415 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/603c63d8-e5d2-428b-925d-aab17f1889dc-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-8dnq8\" (UID: \"603c63d8-e5d2-428b-925d-aab17f1889dc\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 
22:20:04.016528 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/603c63d8-e5d2-428b-925d-aab17f1889dc-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-8dnq8\" (UID: \"603c63d8-e5d2-428b-925d-aab17f1889dc\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.016721 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-krngn" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.016990 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8966798f6-9txj7"] Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.045594 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph5mx\" (UniqueName: \"kubernetes.io/projected/603c63d8-e5d2-428b-925d-aab17f1889dc-kube-api-access-ph5mx\") pod \"nmstate-console-plugin-7fbb5f6569-8dnq8\" (UID: \"603c63d8-e5d2-428b-925d-aab17f1889dc\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.060650 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:04 crc kubenswrapper[4830]: W1203 22:20:04.083689 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47b33d94_20d4_4640_acb2_c25aa2903bd1.slice/crio-144ed4f64ca2e25264493272813f5b99c2dcb2221b1b49ebbd1305518890c7bb WatchSource:0}: Error finding container 144ed4f64ca2e25264493272813f5b99c2dcb2221b1b49ebbd1305518890c7bb: Status 404 returned error can't find the container with id 144ed4f64ca2e25264493272813f5b99c2dcb2221b1b49ebbd1305518890c7bb Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.110132 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-trusted-ca-bundle\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.110389 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-oauth-serving-cert\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.110411 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-service-ca\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.110441 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-console-serving-cert\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.110462 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-console-oauth-config\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.110480 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-console-config\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.110538 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b6lb\" (UniqueName: \"kubernetes.io/projected/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-kube-api-access-7b6lb\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.135020 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.212399 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-oauth-serving-cert\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.212439 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-trusted-ca-bundle\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.212458 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-service-ca\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.212487 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-console-serving-cert\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.212521 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-console-config\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " 
pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.212537 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-console-oauth-config\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.212571 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b6lb\" (UniqueName: \"kubernetes.io/projected/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-kube-api-access-7b6lb\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.213726 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-oauth-serving-cert\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.213988 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-trusted-ca-bundle\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.214429 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-console-config\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 
03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.214465 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-service-ca\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.217240 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-console-oauth-config\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.217969 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-console-serving-cert\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.224295 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc"] Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.236839 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b6lb\" (UniqueName: \"kubernetes.io/projected/316c09d2-2c3a-43ac-99b0-20b2e9c967c5-kube-api-access-7b6lb\") pod \"console-8966798f6-9txj7\" (UID: \"316c09d2-2c3a-43ac-99b0-20b2e9c967c5\") " pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.339784 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8"] Dec 03 22:20:04 crc kubenswrapper[4830]: W1203 22:20:04.341030 4830 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod603c63d8_e5d2_428b_925d_aab17f1889dc.slice/crio-7250c1eb303823721891c4679088acf5ab7b413352accefe29000989127e2c24 WatchSource:0}: Error finding container 7250c1eb303823721891c4679088acf5ab7b413352accefe29000989127e2c24: Status 404 returned error can't find the container with id 7250c1eb303823721891c4679088acf5ab7b413352accefe29000989127e2c24 Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.385102 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.507045 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-krngn"] Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.566195 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" event={"ID":"603c63d8-e5d2-428b-925d-aab17f1889dc","Type":"ContainerStarted","Data":"7250c1eb303823721891c4679088acf5ab7b413352accefe29000989127e2c24"} Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.567794 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s66zt" event={"ID":"47b33d94-20d4-4640-acb2-c25aa2903bd1","Type":"ContainerStarted","Data":"144ed4f64ca2e25264493272813f5b99c2dcb2221b1b49ebbd1305518890c7bb"} Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.569215 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" event={"ID":"a8b2e328-2397-45e1-a593-5f5094799015","Type":"ContainerStarted","Data":"ab885f2e3caba96ac52cb4a34d2bac18fe34b194861ac4c61eeae403afe8afad"} Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.572358 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-krngn" 
event={"ID":"22fd491b-eed1-4558-86bc-ae1f601fcdd0","Type":"ContainerStarted","Data":"3a77ec77005a175123048a7c3207a9c691d2dd88ee5aef741ce63d25fb193dcb"} Dec 03 22:20:04 crc kubenswrapper[4830]: W1203 22:20:04.617922 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod316c09d2_2c3a_43ac_99b0_20b2e9c967c5.slice/crio-afc4264eb8c1e4b12e8cbdf47b5fcd9545255c55f98b4b940808adfec7b38944 WatchSource:0}: Error finding container afc4264eb8c1e4b12e8cbdf47b5fcd9545255c55f98b4b940808adfec7b38944: Status 404 returned error can't find the container with id afc4264eb8c1e4b12e8cbdf47b5fcd9545255c55f98b4b940808adfec7b38944 Dec 03 22:20:04 crc kubenswrapper[4830]: I1203 22:20:04.623832 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8966798f6-9txj7"] Dec 03 22:20:05 crc kubenswrapper[4830]: I1203 22:20:05.581169 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8966798f6-9txj7" event={"ID":"316c09d2-2c3a-43ac-99b0-20b2e9c967c5","Type":"ContainerStarted","Data":"29e6e081d4d913fe6df4d8db8417b98cefc0230f15b2e5f4f76514234be6ef4c"} Dec 03 22:20:05 crc kubenswrapper[4830]: I1203 22:20:05.581446 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8966798f6-9txj7" event={"ID":"316c09d2-2c3a-43ac-99b0-20b2e9c967c5","Type":"ContainerStarted","Data":"afc4264eb8c1e4b12e8cbdf47b5fcd9545255c55f98b4b940808adfec7b38944"} Dec 03 22:20:07 crc kubenswrapper[4830]: I1203 22:20:07.597168 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" event={"ID":"a8b2e328-2397-45e1-a593-5f5094799015","Type":"ContainerStarted","Data":"86aa7ed4d9f95505cc7792730b1bd07c838610389fa58d87aae1ccb4e1097bd0"} Dec 03 22:20:07 crc kubenswrapper[4830]: I1203 22:20:07.597848 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" Dec 03 22:20:07 crc kubenswrapper[4830]: I1203 22:20:07.600739 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-krngn" event={"ID":"22fd491b-eed1-4558-86bc-ae1f601fcdd0","Type":"ContainerStarted","Data":"13cd4fc9d983b25c354344f254907ffa5cacb4b199751b0e962ace33c8d439f7"} Dec 03 22:20:07 crc kubenswrapper[4830]: I1203 22:20:07.603457 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" event={"ID":"603c63d8-e5d2-428b-925d-aab17f1889dc","Type":"ContainerStarted","Data":"fbe8d6d50c30c52e88047d2e9f170413e0e6573c2c1ed4d3c29c88c4147037f8"} Dec 03 22:20:07 crc kubenswrapper[4830]: I1203 22:20:07.605324 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s66zt" event={"ID":"47b33d94-20d4-4640-acb2-c25aa2903bd1","Type":"ContainerStarted","Data":"2a54d0c8053add0e10d62552d76e60715422f9498e12c7c9ab8505ff43040237"} Dec 03 22:20:07 crc kubenswrapper[4830]: I1203 22:20:07.605463 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:07 crc kubenswrapper[4830]: I1203 22:20:07.614671 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" podStartSLOduration=2.011739525 podStartE2EDuration="4.614651038s" podCreationTimestamp="2025-12-03 22:20:03 +0000 UTC" firstStartedPulling="2025-12-03 22:20:04.250978012 +0000 UTC m=+893.247439351" lastFinishedPulling="2025-12-03 22:20:06.853889485 +0000 UTC m=+895.850350864" observedRunningTime="2025-12-03 22:20:07.613375294 +0000 UTC m=+896.609836673" watchObservedRunningTime="2025-12-03 22:20:07.614651038 +0000 UTC m=+896.611112387" Dec 03 22:20:07 crc kubenswrapper[4830]: I1203 22:20:07.618554 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-8966798f6-9txj7" podStartSLOduration=4.618540376 podStartE2EDuration="4.618540376s" podCreationTimestamp="2025-12-03 22:20:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:20:05.596572088 +0000 UTC m=+894.593033497" watchObservedRunningTime="2025-12-03 22:20:07.618540376 +0000 UTC m=+896.615001725" Dec 03 22:20:07 crc kubenswrapper[4830]: I1203 22:20:07.640184 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-8dnq8" podStartSLOduration=2.128755304 podStartE2EDuration="4.64013736s" podCreationTimestamp="2025-12-03 22:20:03 +0000 UTC" firstStartedPulling="2025-12-03 22:20:04.342703525 +0000 UTC m=+893.339164894" lastFinishedPulling="2025-12-03 22:20:06.854085561 +0000 UTC m=+895.850546950" observedRunningTime="2025-12-03 22:20:07.629021434 +0000 UTC m=+896.625482803" watchObservedRunningTime="2025-12-03 22:20:07.64013736 +0000 UTC m=+896.636598719" Dec 03 22:20:07 crc kubenswrapper[4830]: I1203 22:20:07.653116 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-s66zt" podStartSLOduration=1.885161562 podStartE2EDuration="4.653097677s" podCreationTimestamp="2025-12-03 22:20:03 +0000 UTC" firstStartedPulling="2025-12-03 22:20:04.085972411 +0000 UTC m=+893.082433750" lastFinishedPulling="2025-12-03 22:20:06.853908516 +0000 UTC m=+895.850369865" observedRunningTime="2025-12-03 22:20:07.650117305 +0000 UTC m=+896.646578664" watchObservedRunningTime="2025-12-03 22:20:07.653097677 +0000 UTC m=+896.649559026" Dec 03 22:20:09 crc kubenswrapper[4830]: I1203 22:20:09.622552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-krngn" 
event={"ID":"22fd491b-eed1-4558-86bc-ae1f601fcdd0","Type":"ContainerStarted","Data":"5ed2dac655f8535beeba8575e84ce579f6523f01379bbe4d5926342c28b2dd76"} Dec 03 22:20:09 crc kubenswrapper[4830]: I1203 22:20:09.657167 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-krngn" podStartSLOduration=2.060205358 podStartE2EDuration="6.657143571s" podCreationTimestamp="2025-12-03 22:20:03 +0000 UTC" firstStartedPulling="2025-12-03 22:20:04.516994431 +0000 UTC m=+893.513455820" lastFinishedPulling="2025-12-03 22:20:09.113932684 +0000 UTC m=+898.110394033" observedRunningTime="2025-12-03 22:20:09.646003475 +0000 UTC m=+898.642464844" watchObservedRunningTime="2025-12-03 22:20:09.657143571 +0000 UTC m=+898.653604910" Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.606626 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m99l7"] Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.608313 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.627265 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m99l7"] Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.716308 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-utilities\") pod \"redhat-marketplace-m99l7\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.716364 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-catalog-content\") pod \"redhat-marketplace-m99l7\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.716425 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jts7z\" (UniqueName: \"kubernetes.io/projected/66963879-fe8a-433e-98fa-5f582ac05077-kube-api-access-jts7z\") pod \"redhat-marketplace-m99l7\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.817836 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jts7z\" (UniqueName: \"kubernetes.io/projected/66963879-fe8a-433e-98fa-5f582ac05077-kube-api-access-jts7z\") pod \"redhat-marketplace-m99l7\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.818207 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-utilities\") pod \"redhat-marketplace-m99l7\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.818632 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-utilities\") pod \"redhat-marketplace-m99l7\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.818688 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-catalog-content\") pod \"redhat-marketplace-m99l7\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.818965 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-catalog-content\") pod \"redhat-marketplace-m99l7\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.836553 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jts7z\" (UniqueName: \"kubernetes.io/projected/66963879-fe8a-433e-98fa-5f582ac05077-kube-api-access-jts7z\") pod \"redhat-marketplace-m99l7\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:10 crc kubenswrapper[4830]: I1203 22:20:10.944342 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:11 crc kubenswrapper[4830]: I1203 22:20:11.366861 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m99l7"] Dec 03 22:20:11 crc kubenswrapper[4830]: W1203 22:20:11.373673 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66963879_fe8a_433e_98fa_5f582ac05077.slice/crio-5d61b72d7748eaf105898eace2bb564bf1b9a8f021410862eee87646acc443d0 WatchSource:0}: Error finding container 5d61b72d7748eaf105898eace2bb564bf1b9a8f021410862eee87646acc443d0: Status 404 returned error can't find the container with id 5d61b72d7748eaf105898eace2bb564bf1b9a8f021410862eee87646acc443d0 Dec 03 22:20:11 crc kubenswrapper[4830]: I1203 22:20:11.638476 4830 generic.go:334] "Generic (PLEG): container finished" podID="66963879-fe8a-433e-98fa-5f582ac05077" containerID="def3da4214a5f1773bd0401590f98e2c16576bbb47452cd17d10b9b57afc5a69" exitCode=0 Dec 03 22:20:11 crc kubenswrapper[4830]: I1203 22:20:11.638538 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m99l7" event={"ID":"66963879-fe8a-433e-98fa-5f582ac05077","Type":"ContainerDied","Data":"def3da4214a5f1773bd0401590f98e2c16576bbb47452cd17d10b9b57afc5a69"} Dec 03 22:20:11 crc kubenswrapper[4830]: I1203 22:20:11.638583 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m99l7" event={"ID":"66963879-fe8a-433e-98fa-5f582ac05077","Type":"ContainerStarted","Data":"5d61b72d7748eaf105898eace2bb564bf1b9a8f021410862eee87646acc443d0"} Dec 03 22:20:12 crc kubenswrapper[4830]: I1203 22:20:12.647979 4830 generic.go:334] "Generic (PLEG): container finished" podID="66963879-fe8a-433e-98fa-5f582ac05077" containerID="a8faa77df226a5577bd004eb0fe749ce4c576b375edf38715d635e57c6de29e7" exitCode=0 Dec 03 22:20:12 crc kubenswrapper[4830]: I1203 
22:20:12.648050 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m99l7" event={"ID":"66963879-fe8a-433e-98fa-5f582ac05077","Type":"ContainerDied","Data":"a8faa77df226a5577bd004eb0fe749ce4c576b375edf38715d635e57c6de29e7"} Dec 03 22:20:13 crc kubenswrapper[4830]: I1203 22:20:13.655119 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m99l7" event={"ID":"66963879-fe8a-433e-98fa-5f582ac05077","Type":"ContainerStarted","Data":"804cce93cebce445e7557f71620aa606b7e5b7e38600b8dc423d42edfb593a18"} Dec 03 22:20:13 crc kubenswrapper[4830]: I1203 22:20:13.672619 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m99l7" podStartSLOduration=2.192763194 podStartE2EDuration="3.672600304s" podCreationTimestamp="2025-12-03 22:20:10 +0000 UTC" firstStartedPulling="2025-12-03 22:20:11.63990641 +0000 UTC m=+900.636367759" lastFinishedPulling="2025-12-03 22:20:13.11974351 +0000 UTC m=+902.116204869" observedRunningTime="2025-12-03 22:20:13.671102002 +0000 UTC m=+902.667563351" watchObservedRunningTime="2025-12-03 22:20:13.672600304 +0000 UTC m=+902.669061653" Dec 03 22:20:14 crc kubenswrapper[4830]: I1203 22:20:14.103537 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-s66zt" Dec 03 22:20:14 crc kubenswrapper[4830]: I1203 22:20:14.385480 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:14 crc kubenswrapper[4830]: I1203 22:20:14.385581 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:14 crc kubenswrapper[4830]: I1203 22:20:14.392165 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:14 crc kubenswrapper[4830]: 
I1203 22:20:14.666423 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8966798f6-9txj7" Dec 03 22:20:14 crc kubenswrapper[4830]: I1203 22:20:14.713735 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-drfg4"] Dec 03 22:20:20 crc kubenswrapper[4830]: I1203 22:20:20.945034 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:20 crc kubenswrapper[4830]: I1203 22:20:20.945814 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:21 crc kubenswrapper[4830]: I1203 22:20:21.017330 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:21 crc kubenswrapper[4830]: I1203 22:20:21.780311 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:22 crc kubenswrapper[4830]: I1203 22:20:22.258009 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m99l7"] Dec 03 22:20:23 crc kubenswrapper[4830]: I1203 22:20:23.724852 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m99l7" podUID="66963879-fe8a-433e-98fa-5f582ac05077" containerName="registry-server" containerID="cri-o://804cce93cebce445e7557f71620aa606b7e5b7e38600b8dc423d42edfb593a18" gracePeriod=2 Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.015410 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8pqtc" Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.669133 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qxzhr"] Dec 03 22:20:24 crc 
kubenswrapper[4830]: I1203 22:20:24.670757 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.688677 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxzhr"] Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.725417 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-utilities\") pod \"certified-operators-qxzhr\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.725586 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-catalog-content\") pod \"certified-operators-qxzhr\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.725773 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t98zc\" (UniqueName: \"kubernetes.io/projected/e7e03c4b-cc52-435f-afe2-b3bf499f5677-kube-api-access-t98zc\") pod \"certified-operators-qxzhr\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.734366 4830 generic.go:334] "Generic (PLEG): container finished" podID="66963879-fe8a-433e-98fa-5f582ac05077" containerID="804cce93cebce445e7557f71620aa606b7e5b7e38600b8dc423d42edfb593a18" exitCode=0 Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.734411 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-m99l7" event={"ID":"66963879-fe8a-433e-98fa-5f582ac05077","Type":"ContainerDied","Data":"804cce93cebce445e7557f71620aa606b7e5b7e38600b8dc423d42edfb593a18"} Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.827433 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t98zc\" (UniqueName: \"kubernetes.io/projected/e7e03c4b-cc52-435f-afe2-b3bf499f5677-kube-api-access-t98zc\") pod \"certified-operators-qxzhr\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.827488 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-utilities\") pod \"certified-operators-qxzhr\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.827530 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-catalog-content\") pod \"certified-operators-qxzhr\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.827999 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-catalog-content\") pod \"certified-operators-qxzhr\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.828476 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-utilities\") pod \"certified-operators-qxzhr\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:24 crc kubenswrapper[4830]: I1203 22:20:24.852308 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t98zc\" (UniqueName: \"kubernetes.io/projected/e7e03c4b-cc52-435f-afe2-b3bf499f5677-kube-api-access-t98zc\") pod \"certified-operators-qxzhr\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.000440 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.463602 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxzhr"] Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.740977 4830 generic.go:334] "Generic (PLEG): container finished" podID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" containerID="9ad110288029178bade9c4bae5d3eee901cadf61c6cb3cd9443fda35f5f146ea" exitCode=0 Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.741012 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxzhr" event={"ID":"e7e03c4b-cc52-435f-afe2-b3bf499f5677","Type":"ContainerDied","Data":"9ad110288029178bade9c4bae5d3eee901cadf61c6cb3cd9443fda35f5f146ea"} Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.741036 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxzhr" event={"ID":"e7e03c4b-cc52-435f-afe2-b3bf499f5677","Type":"ContainerStarted","Data":"58b9d8fa1043e35a6ccb449aecbf03a06dd06cac40f4a343d23f504e52c64535"} Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.910115 4830 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.939358 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jts7z\" (UniqueName: \"kubernetes.io/projected/66963879-fe8a-433e-98fa-5f582ac05077-kube-api-access-jts7z\") pod \"66963879-fe8a-433e-98fa-5f582ac05077\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.939415 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-utilities\") pod \"66963879-fe8a-433e-98fa-5f582ac05077\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.939468 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-catalog-content\") pod \"66963879-fe8a-433e-98fa-5f582ac05077\" (UID: \"66963879-fe8a-433e-98fa-5f582ac05077\") " Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.940720 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-utilities" (OuterVolumeSpecName: "utilities") pod "66963879-fe8a-433e-98fa-5f582ac05077" (UID: "66963879-fe8a-433e-98fa-5f582ac05077"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.945068 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66963879-fe8a-433e-98fa-5f582ac05077-kube-api-access-jts7z" (OuterVolumeSpecName: "kube-api-access-jts7z") pod "66963879-fe8a-433e-98fa-5f582ac05077" (UID: "66963879-fe8a-433e-98fa-5f582ac05077"). InnerVolumeSpecName "kube-api-access-jts7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:20:25 crc kubenswrapper[4830]: I1203 22:20:25.957151 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66963879-fe8a-433e-98fa-5f582ac05077" (UID: "66963879-fe8a-433e-98fa-5f582ac05077"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.040945 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jts7z\" (UniqueName: \"kubernetes.io/projected/66963879-fe8a-433e-98fa-5f582ac05077-kube-api-access-jts7z\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.040976 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.040985 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66963879-fe8a-433e-98fa-5f582ac05077-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.755917 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m99l7" event={"ID":"66963879-fe8a-433e-98fa-5f582ac05077","Type":"ContainerDied","Data":"5d61b72d7748eaf105898eace2bb564bf1b9a8f021410862eee87646acc443d0"} Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.755981 4830 scope.go:117] "RemoveContainer" containerID="804cce93cebce445e7557f71620aa606b7e5b7e38600b8dc423d42edfb593a18" Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.756016 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m99l7" Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.759120 4830 generic.go:334] "Generic (PLEG): container finished" podID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" containerID="de9cc34b525aa693d676bd4e3e7fd3f681ff29dcbeb2ec7b1058b5b9ea7455db" exitCode=0 Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.759159 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxzhr" event={"ID":"e7e03c4b-cc52-435f-afe2-b3bf499f5677","Type":"ContainerDied","Data":"de9cc34b525aa693d676bd4e3e7fd3f681ff29dcbeb2ec7b1058b5b9ea7455db"} Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.772722 4830 scope.go:117] "RemoveContainer" containerID="a8faa77df226a5577bd004eb0fe749ce4c576b375edf38715d635e57c6de29e7" Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.802748 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m99l7"] Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.809338 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m99l7"] Dec 03 22:20:26 crc kubenswrapper[4830]: I1203 22:20:26.844956 4830 scope.go:117] "RemoveContainer" containerID="def3da4214a5f1773bd0401590f98e2c16576bbb47452cd17d10b9b57afc5a69" Dec 03 22:20:27 crc kubenswrapper[4830]: I1203 22:20:27.346311 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66963879-fe8a-433e-98fa-5f582ac05077" path="/var/lib/kubelet/pods/66963879-fe8a-433e-98fa-5f582ac05077/volumes" Dec 03 22:20:27 crc kubenswrapper[4830]: I1203 22:20:27.770069 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxzhr" event={"ID":"e7e03c4b-cc52-435f-afe2-b3bf499f5677","Type":"ContainerStarted","Data":"56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c"} Dec 03 22:20:27 crc kubenswrapper[4830]: I1203 22:20:27.790271 4830 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qxzhr" podStartSLOduration=2.161261249 podStartE2EDuration="3.790253499s" podCreationTimestamp="2025-12-03 22:20:24 +0000 UTC" firstStartedPulling="2025-12-03 22:20:25.742035383 +0000 UTC m=+914.738496732" lastFinishedPulling="2025-12-03 22:20:27.371027593 +0000 UTC m=+916.367488982" observedRunningTime="2025-12-03 22:20:27.786499117 +0000 UTC m=+916.782960466" watchObservedRunningTime="2025-12-03 22:20:27.790253499 +0000 UTC m=+916.786714848" Dec 03 22:20:35 crc kubenswrapper[4830]: I1203 22:20:35.001536 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:35 crc kubenswrapper[4830]: I1203 22:20:35.003336 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:35 crc kubenswrapper[4830]: I1203 22:20:35.046630 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:35 crc kubenswrapper[4830]: I1203 22:20:35.933692 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:35 crc kubenswrapper[4830]: I1203 22:20:35.995698 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qxzhr"] Dec 03 22:20:37 crc kubenswrapper[4830]: I1203 22:20:37.875767 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qxzhr" podUID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" containerName="registry-server" containerID="cri-o://56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c" gracePeriod=2 Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.803493 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.883708 4830 generic.go:334] "Generic (PLEG): container finished" podID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" containerID="56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c" exitCode=0 Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.883763 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxzhr" event={"ID":"e7e03c4b-cc52-435f-afe2-b3bf499f5677","Type":"ContainerDied","Data":"56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c"} Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.883771 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxzhr" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.883790 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxzhr" event={"ID":"e7e03c4b-cc52-435f-afe2-b3bf499f5677","Type":"ContainerDied","Data":"58b9d8fa1043e35a6ccb449aecbf03a06dd06cac40f4a343d23f504e52c64535"} Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.883810 4830 scope.go:117] "RemoveContainer" containerID="56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.908214 4830 scope.go:117] "RemoveContainer" containerID="de9cc34b525aa693d676bd4e3e7fd3f681ff29dcbeb2ec7b1058b5b9ea7455db" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.927029 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-catalog-content\") pod \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.927083 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-utilities\") pod \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.927115 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t98zc\" (UniqueName: \"kubernetes.io/projected/e7e03c4b-cc52-435f-afe2-b3bf499f5677-kube-api-access-t98zc\") pod \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\" (UID: \"e7e03c4b-cc52-435f-afe2-b3bf499f5677\") " Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.928270 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-utilities" (OuterVolumeSpecName: "utilities") pod "e7e03c4b-cc52-435f-afe2-b3bf499f5677" (UID: "e7e03c4b-cc52-435f-afe2-b3bf499f5677"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.928399 4830 scope.go:117] "RemoveContainer" containerID="9ad110288029178bade9c4bae5d3eee901cadf61c6cb3cd9443fda35f5f146ea" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.938707 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e03c4b-cc52-435f-afe2-b3bf499f5677-kube-api-access-t98zc" (OuterVolumeSpecName: "kube-api-access-t98zc") pod "e7e03c4b-cc52-435f-afe2-b3bf499f5677" (UID: "e7e03c4b-cc52-435f-afe2-b3bf499f5677"). InnerVolumeSpecName "kube-api-access-t98zc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.974295 4830 scope.go:117] "RemoveContainer" containerID="56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c" Dec 03 22:20:38 crc kubenswrapper[4830]: E1203 22:20:38.974717 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c\": container with ID starting with 56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c not found: ID does not exist" containerID="56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.974759 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c"} err="failed to get container status \"56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c\": rpc error: code = NotFound desc = could not find container \"56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c\": container with ID starting with 56ed67a43c96d97dfa789542a05a07b47860f35758d464125af52e7aaf42096c not found: ID does not exist" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.974783 4830 scope.go:117] "RemoveContainer" containerID="de9cc34b525aa693d676bd4e3e7fd3f681ff29dcbeb2ec7b1058b5b9ea7455db" Dec 03 22:20:38 crc kubenswrapper[4830]: E1203 22:20:38.975231 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de9cc34b525aa693d676bd4e3e7fd3f681ff29dcbeb2ec7b1058b5b9ea7455db\": container with ID starting with de9cc34b525aa693d676bd4e3e7fd3f681ff29dcbeb2ec7b1058b5b9ea7455db not found: ID does not exist" containerID="de9cc34b525aa693d676bd4e3e7fd3f681ff29dcbeb2ec7b1058b5b9ea7455db" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.975261 
4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9cc34b525aa693d676bd4e3e7fd3f681ff29dcbeb2ec7b1058b5b9ea7455db"} err="failed to get container status \"de9cc34b525aa693d676bd4e3e7fd3f681ff29dcbeb2ec7b1058b5b9ea7455db\": rpc error: code = NotFound desc = could not find container \"de9cc34b525aa693d676bd4e3e7fd3f681ff29dcbeb2ec7b1058b5b9ea7455db\": container with ID starting with de9cc34b525aa693d676bd4e3e7fd3f681ff29dcbeb2ec7b1058b5b9ea7455db not found: ID does not exist" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.975279 4830 scope.go:117] "RemoveContainer" containerID="9ad110288029178bade9c4bae5d3eee901cadf61c6cb3cd9443fda35f5f146ea" Dec 03 22:20:38 crc kubenswrapper[4830]: E1203 22:20:38.975857 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad110288029178bade9c4bae5d3eee901cadf61c6cb3cd9443fda35f5f146ea\": container with ID starting with 9ad110288029178bade9c4bae5d3eee901cadf61c6cb3cd9443fda35f5f146ea not found: ID does not exist" containerID="9ad110288029178bade9c4bae5d3eee901cadf61c6cb3cd9443fda35f5f146ea" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.975886 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad110288029178bade9c4bae5d3eee901cadf61c6cb3cd9443fda35f5f146ea"} err="failed to get container status \"9ad110288029178bade9c4bae5d3eee901cadf61c6cb3cd9443fda35f5f146ea\": rpc error: code = NotFound desc = could not find container \"9ad110288029178bade9c4bae5d3eee901cadf61c6cb3cd9443fda35f5f146ea\": container with ID starting with 9ad110288029178bade9c4bae5d3eee901cadf61c6cb3cd9443fda35f5f146ea not found: ID does not exist" Dec 03 22:20:38 crc kubenswrapper[4830]: I1203 22:20:38.976642 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "e7e03c4b-cc52-435f-afe2-b3bf499f5677" (UID: "e7e03c4b-cc52-435f-afe2-b3bf499f5677"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:20:39 crc kubenswrapper[4830]: I1203 22:20:39.028387 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:39 crc kubenswrapper[4830]: I1203 22:20:39.028433 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7e03c4b-cc52-435f-afe2-b3bf499f5677-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:39 crc kubenswrapper[4830]: I1203 22:20:39.028477 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t98zc\" (UniqueName: \"kubernetes.io/projected/e7e03c4b-cc52-435f-afe2-b3bf499f5677-kube-api-access-t98zc\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:39 crc kubenswrapper[4830]: I1203 22:20:39.213971 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qxzhr"] Dec 03 22:20:39 crc kubenswrapper[4830]: I1203 22:20:39.220090 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qxzhr"] Dec 03 22:20:39 crc kubenswrapper[4830]: I1203 22:20:39.351164 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" path="/var/lib/kubelet/pods/e7e03c4b-cc52-435f-afe2-b3bf499f5677/volumes" Dec 03 22:20:39 crc kubenswrapper[4830]: I1203 22:20:39.770358 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-drfg4" podUID="9f6628e7-55ff-4c71-b3e7-102cb3b6954d" containerName="console" containerID="cri-o://1e734944c3a0ac7aef3936378640e5338f592ef96fa9bf6958cbe649667ea342" gracePeriod=15 Dec 03 
22:20:39 crc kubenswrapper[4830]: I1203 22:20:39.897613 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-drfg4_9f6628e7-55ff-4c71-b3e7-102cb3b6954d/console/0.log" Dec 03 22:20:39 crc kubenswrapper[4830]: I1203 22:20:39.897662 4830 generic.go:334] "Generic (PLEG): container finished" podID="9f6628e7-55ff-4c71-b3e7-102cb3b6954d" containerID="1e734944c3a0ac7aef3936378640e5338f592ef96fa9bf6958cbe649667ea342" exitCode=2 Dec 03 22:20:39 crc kubenswrapper[4830]: I1203 22:20:39.897697 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-drfg4" event={"ID":"9f6628e7-55ff-4c71-b3e7-102cb3b6954d","Type":"ContainerDied","Data":"1e734944c3a0ac7aef3936378640e5338f592ef96fa9bf6958cbe649667ea342"} Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.149819 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-drfg4_9f6628e7-55ff-4c71-b3e7-102cb3b6954d/console/0.log" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.149903 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.245849 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-serving-cert\") pod \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.245902 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-oauth-config\") pod \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.245932 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2f72\" (UniqueName: \"kubernetes.io/projected/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-kube-api-access-b2f72\") pod \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.245956 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-trusted-ca-bundle\") pod \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.246034 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-service-ca\") pod \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.246050 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-config\") pod \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.246070 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-oauth-serving-cert\") pod \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\" (UID: \"9f6628e7-55ff-4c71-b3e7-102cb3b6954d\") " Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.246907 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9f6628e7-55ff-4c71-b3e7-102cb3b6954d" (UID: "9f6628e7-55ff-4c71-b3e7-102cb3b6954d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.246925 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-config" (OuterVolumeSpecName: "console-config") pod "9f6628e7-55ff-4c71-b3e7-102cb3b6954d" (UID: "9f6628e7-55ff-4c71-b3e7-102cb3b6954d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.246921 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-service-ca" (OuterVolumeSpecName: "service-ca") pod "9f6628e7-55ff-4c71-b3e7-102cb3b6954d" (UID: "9f6628e7-55ff-4c71-b3e7-102cb3b6954d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.247261 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9f6628e7-55ff-4c71-b3e7-102cb3b6954d" (UID: "9f6628e7-55ff-4c71-b3e7-102cb3b6954d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.255994 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9f6628e7-55ff-4c71-b3e7-102cb3b6954d" (UID: "9f6628e7-55ff-4c71-b3e7-102cb3b6954d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.256030 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-kube-api-access-b2f72" (OuterVolumeSpecName: "kube-api-access-b2f72") pod "9f6628e7-55ff-4c71-b3e7-102cb3b6954d" (UID: "9f6628e7-55ff-4c71-b3e7-102cb3b6954d"). InnerVolumeSpecName "kube-api-access-b2f72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.256209 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9f6628e7-55ff-4c71-b3e7-102cb3b6954d" (UID: "9f6628e7-55ff-4c71-b3e7-102cb3b6954d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.347365 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.347398 4830 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.347409 4830 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.347419 4830 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.347427 4830 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.347435 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2f72\" (UniqueName: \"kubernetes.io/projected/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-kube-api-access-b2f72\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.347442 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6628e7-55ff-4c71-b3e7-102cb3b6954d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:40 crc 
kubenswrapper[4830]: I1203 22:20:40.494205 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg"] Dec 03 22:20:40 crc kubenswrapper[4830]: E1203 22:20:40.495188 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66963879-fe8a-433e-98fa-5f582ac05077" containerName="extract-content" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.495349 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="66963879-fe8a-433e-98fa-5f582ac05077" containerName="extract-content" Dec 03 22:20:40 crc kubenswrapper[4830]: E1203 22:20:40.495467 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6628e7-55ff-4c71-b3e7-102cb3b6954d" containerName="console" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.495607 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6628e7-55ff-4c71-b3e7-102cb3b6954d" containerName="console" Dec 03 22:20:40 crc kubenswrapper[4830]: E1203 22:20:40.495713 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66963879-fe8a-433e-98fa-5f582ac05077" containerName="registry-server" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.495807 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="66963879-fe8a-433e-98fa-5f582ac05077" containerName="registry-server" Dec 03 22:20:40 crc kubenswrapper[4830]: E1203 22:20:40.495905 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" containerName="extract-content" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.495997 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" containerName="extract-content" Dec 03 22:20:40 crc kubenswrapper[4830]: E1203 22:20:40.496099 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66963879-fe8a-433e-98fa-5f582ac05077" containerName="extract-utilities" Dec 03 22:20:40 crc 
kubenswrapper[4830]: I1203 22:20:40.496184 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="66963879-fe8a-433e-98fa-5f582ac05077" containerName="extract-utilities" Dec 03 22:20:40 crc kubenswrapper[4830]: E1203 22:20:40.496285 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" containerName="registry-server" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.496387 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" containerName="registry-server" Dec 03 22:20:40 crc kubenswrapper[4830]: E1203 22:20:40.496556 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" containerName="extract-utilities" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.496661 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" containerName="extract-utilities" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.496948 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6628e7-55ff-4c71-b3e7-102cb3b6954d" containerName="console" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.497064 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e03c4b-cc52-435f-afe2-b3bf499f5677" containerName="registry-server" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.497182 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="66963879-fe8a-433e-98fa-5f582ac05077" containerName="registry-server" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.498595 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.501962 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg"] Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.502157 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.650439 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.650563 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.650593 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsvh7\" (UniqueName: \"kubernetes.io/projected/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-kube-api-access-hsvh7\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:40 crc kubenswrapper[4830]: 
I1203 22:20:40.752210 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.752762 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.753064 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsvh7\" (UniqueName: \"kubernetes.io/projected/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-kube-api-access-hsvh7\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.753000 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.752694 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.789569 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsvh7\" (UniqueName: \"kubernetes.io/projected/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-kube-api-access-hsvh7\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.820280 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.914900 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-drfg4_9f6628e7-55ff-4c71-b3e7-102cb3b6954d/console/0.log" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.914952 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-drfg4" event={"ID":"9f6628e7-55ff-4c71-b3e7-102cb3b6954d","Type":"ContainerDied","Data":"64dd6dd0745667ff722ab0da64d6ad23a6aef43bb07231f8596f4cf9249029f2"} Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.914985 4830 scope.go:117] "RemoveContainer" containerID="1e734944c3a0ac7aef3936378640e5338f592ef96fa9bf6958cbe649667ea342" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.915104 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-drfg4" Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.961578 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-drfg4"] Dec 03 22:20:40 crc kubenswrapper[4830]: I1203 22:20:40.966421 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-drfg4"] Dec 03 22:20:41 crc kubenswrapper[4830]: I1203 22:20:41.288780 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg"] Dec 03 22:20:41 crc kubenswrapper[4830]: I1203 22:20:41.361104 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6628e7-55ff-4c71-b3e7-102cb3b6954d" path="/var/lib/kubelet/pods/9f6628e7-55ff-4c71-b3e7-102cb3b6954d/volumes" Dec 03 22:20:41 crc kubenswrapper[4830]: I1203 22:20:41.926833 4830 generic.go:334] "Generic (PLEG): container finished" podID="aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" containerID="b02882cca440ee66891de4213110f43469d75b6672668010c284c4f0df65345e" exitCode=0 Dec 03 22:20:41 crc kubenswrapper[4830]: I1203 22:20:41.926876 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" event={"ID":"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a","Type":"ContainerDied","Data":"b02882cca440ee66891de4213110f43469d75b6672668010c284c4f0df65345e"} Dec 03 22:20:41 crc kubenswrapper[4830]: I1203 22:20:41.926904 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" event={"ID":"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a","Type":"ContainerStarted","Data":"fbea1bfe0f61bd424c5fde0a2da3ab28d94e6d218316bda074c25f44bb635815"} Dec 03 22:20:43 crc kubenswrapper[4830]: I1203 22:20:43.949185 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" containerID="a889a1481073d35de0642c9ed6bf91f0f23eeefbb4b0ab2f8097461f76f03c48" exitCode=0 Dec 03 22:20:43 crc kubenswrapper[4830]: I1203 22:20:43.949235 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" event={"ID":"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a","Type":"ContainerDied","Data":"a889a1481073d35de0642c9ed6bf91f0f23eeefbb4b0ab2f8097461f76f03c48"} Dec 03 22:20:44 crc kubenswrapper[4830]: I1203 22:20:44.960753 4830 generic.go:334] "Generic (PLEG): container finished" podID="aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" containerID="6b46c3de84dda91ab0de75276987ff381250d8a372515d7ae99c67c695925bdb" exitCode=0 Dec 03 22:20:44 crc kubenswrapper[4830]: I1203 22:20:44.961099 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" event={"ID":"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a","Type":"ContainerDied","Data":"6b46c3de84dda91ab0de75276987ff381250d8a372515d7ae99c67c695925bdb"} Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.230307 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.337835 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-util\") pod \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.338042 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsvh7\" (UniqueName: \"kubernetes.io/projected/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-kube-api-access-hsvh7\") pod \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.338143 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-bundle\") pod \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\" (UID: \"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a\") " Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.339130 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-bundle" (OuterVolumeSpecName: "bundle") pod "aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" (UID: "aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.344953 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-kube-api-access-hsvh7" (OuterVolumeSpecName: "kube-api-access-hsvh7") pod "aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" (UID: "aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a"). InnerVolumeSpecName "kube-api-access-hsvh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.358710 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-util" (OuterVolumeSpecName: "util") pod "aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" (UID: "aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.440249 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.440300 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-util\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.440312 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsvh7\" (UniqueName: \"kubernetes.io/projected/aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a-kube-api-access-hsvh7\") on node \"crc\" DevicePath \"\"" Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.974993 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" event={"ID":"aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a","Type":"ContainerDied","Data":"fbea1bfe0f61bd424c5fde0a2da3ab28d94e6d218316bda074c25f44bb635815"} Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.975030 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbea1bfe0f61bd424c5fde0a2da3ab28d94e6d218316bda074c25f44bb635815" Dec 03 22:20:46 crc kubenswrapper[4830]: I1203 22:20:46.975094 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.569767 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg"] Dec 03 22:20:57 crc kubenswrapper[4830]: E1203 22:20:57.570607 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" containerName="extract" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.570622 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" containerName="extract" Dec 03 22:20:57 crc kubenswrapper[4830]: E1203 22:20:57.570638 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" containerName="util" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.570645 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" containerName="util" Dec 03 22:20:57 crc kubenswrapper[4830]: E1203 22:20:57.570660 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" containerName="pull" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.570667 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" containerName="pull" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.570780 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a" containerName="extract" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.571254 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.574237 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.574325 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.574638 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.574433 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.575474 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-zkr52" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.592947 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg"] Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.691649 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz7gk\" (UniqueName: \"kubernetes.io/projected/3898c0c7-fe9f-4446-803f-01d5c019b406-kube-api-access-vz7gk\") pod \"metallb-operator-controller-manager-6546545fd8-lr9kg\" (UID: \"3898c0c7-fe9f-4446-803f-01d5c019b406\") " pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.691705 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3898c0c7-fe9f-4446-803f-01d5c019b406-apiservice-cert\") pod 
\"metallb-operator-controller-manager-6546545fd8-lr9kg\" (UID: \"3898c0c7-fe9f-4446-803f-01d5c019b406\") " pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.691838 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3898c0c7-fe9f-4446-803f-01d5c019b406-webhook-cert\") pod \"metallb-operator-controller-manager-6546545fd8-lr9kg\" (UID: \"3898c0c7-fe9f-4446-803f-01d5c019b406\") " pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.793377 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3898c0c7-fe9f-4446-803f-01d5c019b406-webhook-cert\") pod \"metallb-operator-controller-manager-6546545fd8-lr9kg\" (UID: \"3898c0c7-fe9f-4446-803f-01d5c019b406\") " pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.793436 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz7gk\" (UniqueName: \"kubernetes.io/projected/3898c0c7-fe9f-4446-803f-01d5c019b406-kube-api-access-vz7gk\") pod \"metallb-operator-controller-manager-6546545fd8-lr9kg\" (UID: \"3898c0c7-fe9f-4446-803f-01d5c019b406\") " pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.793455 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3898c0c7-fe9f-4446-803f-01d5c019b406-apiservice-cert\") pod \"metallb-operator-controller-manager-6546545fd8-lr9kg\" (UID: \"3898c0c7-fe9f-4446-803f-01d5c019b406\") " pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:20:57 crc 
kubenswrapper[4830]: I1203 22:20:57.799801 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3898c0c7-fe9f-4446-803f-01d5c019b406-apiservice-cert\") pod \"metallb-operator-controller-manager-6546545fd8-lr9kg\" (UID: \"3898c0c7-fe9f-4446-803f-01d5c019b406\") " pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.801595 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3898c0c7-fe9f-4446-803f-01d5c019b406-webhook-cert\") pod \"metallb-operator-controller-manager-6546545fd8-lr9kg\" (UID: \"3898c0c7-fe9f-4446-803f-01d5c019b406\") " pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.818700 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz7gk\" (UniqueName: \"kubernetes.io/projected/3898c0c7-fe9f-4446-803f-01d5c019b406-kube-api-access-vz7gk\") pod \"metallb-operator-controller-manager-6546545fd8-lr9kg\" (UID: \"3898c0c7-fe9f-4446-803f-01d5c019b406\") " pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.893891 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.937319 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt"] Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.938236 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.939972 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.940139 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-lg9g8" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.945670 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.953916 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt"] Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.995817 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a308b4ff-4a2a-4ab4-9171-9dc572b71c29-webhook-cert\") pod \"metallb-operator-webhook-server-79d7c9f765-w2ltt\" (UID: \"a308b4ff-4a2a-4ab4-9171-9dc572b71c29\") " pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.995890 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a308b4ff-4a2a-4ab4-9171-9dc572b71c29-apiservice-cert\") pod \"metallb-operator-webhook-server-79d7c9f765-w2ltt\" (UID: \"a308b4ff-4a2a-4ab4-9171-9dc572b71c29\") " pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:20:57 crc kubenswrapper[4830]: I1203 22:20:57.995922 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxxrw\" (UniqueName: 
\"kubernetes.io/projected/a308b4ff-4a2a-4ab4-9171-9dc572b71c29-kube-api-access-qxxrw\") pod \"metallb-operator-webhook-server-79d7c9f765-w2ltt\" (UID: \"a308b4ff-4a2a-4ab4-9171-9dc572b71c29\") " pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:20:58 crc kubenswrapper[4830]: I1203 22:20:58.100938 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a308b4ff-4a2a-4ab4-9171-9dc572b71c29-apiservice-cert\") pod \"metallb-operator-webhook-server-79d7c9f765-w2ltt\" (UID: \"a308b4ff-4a2a-4ab4-9171-9dc572b71c29\") " pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:20:58 crc kubenswrapper[4830]: I1203 22:20:58.101219 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxxrw\" (UniqueName: \"kubernetes.io/projected/a308b4ff-4a2a-4ab4-9171-9dc572b71c29-kube-api-access-qxxrw\") pod \"metallb-operator-webhook-server-79d7c9f765-w2ltt\" (UID: \"a308b4ff-4a2a-4ab4-9171-9dc572b71c29\") " pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:20:58 crc kubenswrapper[4830]: I1203 22:20:58.101327 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a308b4ff-4a2a-4ab4-9171-9dc572b71c29-webhook-cert\") pod \"metallb-operator-webhook-server-79d7c9f765-w2ltt\" (UID: \"a308b4ff-4a2a-4ab4-9171-9dc572b71c29\") " pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:20:58 crc kubenswrapper[4830]: I1203 22:20:58.105858 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a308b4ff-4a2a-4ab4-9171-9dc572b71c29-webhook-cert\") pod \"metallb-operator-webhook-server-79d7c9f765-w2ltt\" (UID: \"a308b4ff-4a2a-4ab4-9171-9dc572b71c29\") " pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 
22:20:58 crc kubenswrapper[4830]: I1203 22:20:58.107970 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a308b4ff-4a2a-4ab4-9171-9dc572b71c29-apiservice-cert\") pod \"metallb-operator-webhook-server-79d7c9f765-w2ltt\" (UID: \"a308b4ff-4a2a-4ab4-9171-9dc572b71c29\") " pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:20:58 crc kubenswrapper[4830]: I1203 22:20:58.128267 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxxrw\" (UniqueName: \"kubernetes.io/projected/a308b4ff-4a2a-4ab4-9171-9dc572b71c29-kube-api-access-qxxrw\") pod \"metallb-operator-webhook-server-79d7c9f765-w2ltt\" (UID: \"a308b4ff-4a2a-4ab4-9171-9dc572b71c29\") " pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:20:58 crc kubenswrapper[4830]: I1203 22:20:58.274355 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:20:58 crc kubenswrapper[4830]: I1203 22:20:58.416032 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg"] Dec 03 22:20:58 crc kubenswrapper[4830]: W1203 22:20:58.433661 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3898c0c7_fe9f_4446_803f_01d5c019b406.slice/crio-6f42b1b20e0af4b43c14f4c7d5e37dbc773bf12c75f6f03a0d20777867d3498a WatchSource:0}: Error finding container 6f42b1b20e0af4b43c14f4c7d5e37dbc773bf12c75f6f03a0d20777867d3498a: Status 404 returned error can't find the container with id 6f42b1b20e0af4b43c14f4c7d5e37dbc773bf12c75f6f03a0d20777867d3498a Dec 03 22:20:58 crc kubenswrapper[4830]: I1203 22:20:58.489335 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt"] Dec 03 
22:20:58 crc kubenswrapper[4830]: W1203 22:20:58.495904 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda308b4ff_4a2a_4ab4_9171_9dc572b71c29.slice/crio-7d355f135c119c6abfee329417a4ca87266cb2b7439f155134613e5f5f8bfa81 WatchSource:0}: Error finding container 7d355f135c119c6abfee329417a4ca87266cb2b7439f155134613e5f5f8bfa81: Status 404 returned error can't find the container with id 7d355f135c119c6abfee329417a4ca87266cb2b7439f155134613e5f5f8bfa81 Dec 03 22:20:59 crc kubenswrapper[4830]: I1203 22:20:59.050249 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" event={"ID":"a308b4ff-4a2a-4ab4-9171-9dc572b71c29","Type":"ContainerStarted","Data":"7d355f135c119c6abfee329417a4ca87266cb2b7439f155134613e5f5f8bfa81"} Dec 03 22:20:59 crc kubenswrapper[4830]: I1203 22:20:59.053467 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" event={"ID":"3898c0c7-fe9f-4446-803f-01d5c019b406","Type":"ContainerStarted","Data":"6f42b1b20e0af4b43c14f4c7d5e37dbc773bf12c75f6f03a0d20777867d3498a"} Dec 03 22:21:05 crc kubenswrapper[4830]: I1203 22:21:05.118966 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" event={"ID":"a308b4ff-4a2a-4ab4-9171-9dc572b71c29","Type":"ContainerStarted","Data":"4c98eb37de14094090891740249964d1ab59858e38def3043c5f5e256aa75101"} Dec 03 22:21:05 crc kubenswrapper[4830]: I1203 22:21:05.119469 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:21:05 crc kubenswrapper[4830]: I1203 22:21:05.121266 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" 
event={"ID":"3898c0c7-fe9f-4446-803f-01d5c019b406","Type":"ContainerStarted","Data":"aeafa952d3a39b9ff29c864c3b4ba0ba87a7e39530aff7f9b3ce4d40fd01adb3"} Dec 03 22:21:05 crc kubenswrapper[4830]: I1203 22:21:05.121446 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:21:05 crc kubenswrapper[4830]: I1203 22:21:05.140155 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" podStartSLOduration=2.489902738 podStartE2EDuration="8.140137282s" podCreationTimestamp="2025-12-03 22:20:57 +0000 UTC" firstStartedPulling="2025-12-03 22:20:58.498790869 +0000 UTC m=+947.495252218" lastFinishedPulling="2025-12-03 22:21:04.149025413 +0000 UTC m=+953.145486762" observedRunningTime="2025-12-03 22:21:05.139373772 +0000 UTC m=+954.135835201" watchObservedRunningTime="2025-12-03 22:21:05.140137282 +0000 UTC m=+954.136598631" Dec 03 22:21:05 crc kubenswrapper[4830]: I1203 22:21:05.171374 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" podStartSLOduration=2.479656761 podStartE2EDuration="8.171339978s" podCreationTimestamp="2025-12-03 22:20:57 +0000 UTC" firstStartedPulling="2025-12-03 22:20:58.437094538 +0000 UTC m=+947.433555887" lastFinishedPulling="2025-12-03 22:21:04.128777755 +0000 UTC m=+953.125239104" observedRunningTime="2025-12-03 22:21:05.168544032 +0000 UTC m=+954.165005391" watchObservedRunningTime="2025-12-03 22:21:05.171339978 +0000 UTC m=+954.167801377" Dec 03 22:21:18 crc kubenswrapper[4830]: I1203 22:21:18.279755 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-79d7c9f765-w2ltt" Dec 03 22:21:26 crc kubenswrapper[4830]: I1203 22:21:26.681728 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:21:26 crc kubenswrapper[4830]: I1203 22:21:26.683688 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:21:37 crc kubenswrapper[4830]: I1203 22:21:37.898724 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6546545fd8-lr9kg" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.733180 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-467ht"] Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.738840 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.749857 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.751885 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-btcd2" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.758875 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn"] Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.759813 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.760965 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.763806 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.774808 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn"] Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.781405 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/947e6799-183e-4ab9-8dac-bc02f1232c6e-frr-startup\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.781463 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-frr-conf\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.781489 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/947e6799-183e-4ab9-8dac-bc02f1232c6e-metrics-certs\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.781550 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm5wq\" (UniqueName: 
\"kubernetes.io/projected/ea33811b-aa63-48d4-b9f3-9446ca919e3c-kube-api-access-wm5wq\") pod \"frr-k8s-webhook-server-7fcb986d4-dj5pn\" (UID: \"ea33811b-aa63-48d4-b9f3-9446ca919e3c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.781577 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea33811b-aa63-48d4-b9f3-9446ca919e3c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-dj5pn\" (UID: \"ea33811b-aa63-48d4-b9f3-9446ca919e3c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.781603 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzdjq\" (UniqueName: \"kubernetes.io/projected/947e6799-183e-4ab9-8dac-bc02f1232c6e-kube-api-access-zzdjq\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.781637 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-metrics\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.781694 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-frr-sockets\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.781715 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-reloader\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.854873 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-4q9rp"] Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.855797 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4q9rp" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.857388 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-rk2j8" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.860972 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.861131 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.861202 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.885087 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-cc4b2"] Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.885964 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887037 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/947e6799-183e-4ab9-8dac-bc02f1232c6e-metrics-certs\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887066 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5rh9\" (UniqueName: \"kubernetes.io/projected/194dbe3b-afec-4576-b9ac-8810ad9c9482-kube-api-access-d5rh9\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887093 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm5wq\" (UniqueName: \"kubernetes.io/projected/ea33811b-aa63-48d4-b9f3-9446ca919e3c-kube-api-access-wm5wq\") pod \"frr-k8s-webhook-server-7fcb986d4-dj5pn\" (UID: \"ea33811b-aa63-48d4-b9f3-9446ca919e3c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887111 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea33811b-aa63-48d4-b9f3-9446ca919e3c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-dj5pn\" (UID: \"ea33811b-aa63-48d4-b9f3-9446ca919e3c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887158 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzdjq\" (UniqueName: \"kubernetes.io/projected/947e6799-183e-4ab9-8dac-bc02f1232c6e-kube-api-access-zzdjq\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " 
pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887181 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-metrics\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: E1203 22:21:38.887188 4830 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 03 22:21:38 crc kubenswrapper[4830]: E1203 22:21:38.887438 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea33811b-aa63-48d4-b9f3-9446ca919e3c-cert podName:ea33811b-aa63-48d4-b9f3-9446ca919e3c nodeName:}" failed. No retries permitted until 2025-12-03 22:21:39.387417968 +0000 UTC m=+988.383879317 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea33811b-aa63-48d4-b9f3-9446ca919e3c-cert") pod "frr-k8s-webhook-server-7fcb986d4-dj5pn" (UID: "ea33811b-aa63-48d4-b9f3-9446ca919e3c") : secret "frr-k8s-webhook-server-cert" not found Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887197 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-memberlist\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887593 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/194dbe3b-afec-4576-b9ac-8810ad9c9482-metallb-excludel2\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:38 crc 
kubenswrapper[4830]: I1203 22:21:38.887677 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-frr-sockets\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887720 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-reloader\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887757 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-metrics-certs\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887778 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/947e6799-183e-4ab9-8dac-bc02f1232c6e-frr-startup\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887801 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-frr-conf\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.887861 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-metrics\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.888010 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-frr-sockets\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.888058 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-frr-conf\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.888733 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/947e6799-183e-4ab9-8dac-bc02f1232c6e-frr-startup\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.889795 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/947e6799-183e-4ab9-8dac-bc02f1232c6e-reloader\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.893701 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.907412 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/947e6799-183e-4ab9-8dac-bc02f1232c6e-metrics-certs\") pod 
\"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.924891 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm5wq\" (UniqueName: \"kubernetes.io/projected/ea33811b-aa63-48d4-b9f3-9446ca919e3c-kube-api-access-wm5wq\") pod \"frr-k8s-webhook-server-7fcb986d4-dj5pn\" (UID: \"ea33811b-aa63-48d4-b9f3-9446ca919e3c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.944629 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-cc4b2"] Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.954313 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzdjq\" (UniqueName: \"kubernetes.io/projected/947e6799-183e-4ab9-8dac-bc02f1232c6e-kube-api-access-zzdjq\") pod \"frr-k8s-467ht\" (UID: \"947e6799-183e-4ab9-8dac-bc02f1232c6e\") " pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.992080 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5rh9\" (UniqueName: \"kubernetes.io/projected/194dbe3b-afec-4576-b9ac-8810ad9c9482-kube-api-access-d5rh9\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.992163 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-metrics-certs\") pod \"controller-f8648f98b-cc4b2\" (UID: \"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d\") " pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.992181 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memberlist\" (UniqueName: \"kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-memberlist\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.992206 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/194dbe3b-afec-4576-b9ac-8810ad9c9482-metallb-excludel2\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.992237 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frdt6\" (UniqueName: \"kubernetes.io/projected/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-kube-api-access-frdt6\") pod \"controller-f8648f98b-cc4b2\" (UID: \"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d\") " pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.992267 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-metrics-certs\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.992287 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-cert\") pod \"controller-f8648f98b-cc4b2\" (UID: \"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d\") " pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:38 crc kubenswrapper[4830]: E1203 22:21:38.992637 4830 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 22:21:38 crc kubenswrapper[4830]: E1203 22:21:38.992678 4830 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-memberlist podName:194dbe3b-afec-4576-b9ac-8810ad9c9482 nodeName:}" failed. No retries permitted until 2025-12-03 22:21:39.492664894 +0000 UTC m=+988.489126243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-memberlist") pod "speaker-4q9rp" (UID: "194dbe3b-afec-4576-b9ac-8810ad9c9482") : secret "metallb-memberlist" not found Dec 03 22:21:38 crc kubenswrapper[4830]: I1203 22:21:38.993291 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/194dbe3b-afec-4576-b9ac-8810ad9c9482-metallb-excludel2\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:38 crc kubenswrapper[4830]: E1203 22:21:38.993349 4830 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 03 22:21:38 crc kubenswrapper[4830]: E1203 22:21:38.993370 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-metrics-certs podName:194dbe3b-afec-4576-b9ac-8810ad9c9482 nodeName:}" failed. No retries permitted until 2025-12-03 22:21:39.493363773 +0000 UTC m=+988.489825112 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-metrics-certs") pod "speaker-4q9rp" (UID: "194dbe3b-afec-4576-b9ac-8810ad9c9482") : secret "speaker-certs-secret" not found Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.013041 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5rh9\" (UniqueName: \"kubernetes.io/projected/194dbe3b-afec-4576-b9ac-8810ad9c9482-kube-api-access-d5rh9\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.059668 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.093072 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-cert\") pod \"controller-f8648f98b-cc4b2\" (UID: \"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d\") " pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.093464 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-metrics-certs\") pod \"controller-f8648f98b-cc4b2\" (UID: \"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d\") " pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:39 crc kubenswrapper[4830]: E1203 22:21:39.093607 4830 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 03 22:21:39 crc kubenswrapper[4830]: E1203 22:21:39.093678 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-metrics-certs podName:36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d nodeName:}" 
failed. No retries permitted until 2025-12-03 22:21:39.593661014 +0000 UTC m=+988.590122363 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-metrics-certs") pod "controller-f8648f98b-cc4b2" (UID: "36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d") : secret "controller-certs-secret" not found Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.093895 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frdt6\" (UniqueName: \"kubernetes.io/projected/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-kube-api-access-frdt6\") pod \"controller-f8648f98b-cc4b2\" (UID: \"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d\") " pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.096036 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.109834 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-cert\") pod \"controller-f8648f98b-cc4b2\" (UID: \"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d\") " pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.114372 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frdt6\" (UniqueName: \"kubernetes.io/projected/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-kube-api-access-frdt6\") pod \"controller-f8648f98b-cc4b2\" (UID: \"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d\") " pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.358855 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-467ht" 
event={"ID":"947e6799-183e-4ab9-8dac-bc02f1232c6e","Type":"ContainerStarted","Data":"6d6fc22eab2b385849e84487a96a18231e0d8b4ac4a8a327cc84205ef3ccb0e3"} Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.397147 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea33811b-aa63-48d4-b9f3-9446ca919e3c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-dj5pn\" (UID: \"ea33811b-aa63-48d4-b9f3-9446ca919e3c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.400921 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea33811b-aa63-48d4-b9f3-9446ca919e3c-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-dj5pn\" (UID: \"ea33811b-aa63-48d4-b9f3-9446ca919e3c\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.498127 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-metrics-certs\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.498218 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-memberlist\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:39 crc kubenswrapper[4830]: E1203 22:21:39.498328 4830 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 22:21:39 crc kubenswrapper[4830]: E1203 22:21:39.498384 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-memberlist podName:194dbe3b-afec-4576-b9ac-8810ad9c9482 nodeName:}" failed. No retries permitted until 2025-12-03 22:21:40.498369486 +0000 UTC m=+989.494830835 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-memberlist") pod "speaker-4q9rp" (UID: "194dbe3b-afec-4576-b9ac-8810ad9c9482") : secret "metallb-memberlist" not found Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.502010 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-metrics-certs\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.599414 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-metrics-certs\") pod \"controller-f8648f98b-cc4b2\" (UID: \"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d\") " pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.604970 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d-metrics-certs\") pod \"controller-f8648f98b-cc4b2\" (UID: \"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d\") " pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.681682 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.874954 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:39 crc kubenswrapper[4830]: I1203 22:21:39.981041 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn"] Dec 03 22:21:40 crc kubenswrapper[4830]: I1203 22:21:40.108135 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-cc4b2"] Dec 03 22:21:40 crc kubenswrapper[4830]: I1203 22:21:40.368233 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-cc4b2" event={"ID":"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d","Type":"ContainerStarted","Data":"c06e76ca3ead63e86d8207aa8d82dd4c49376e5f6a307ac3158a185cdd97f9cb"} Dec 03 22:21:40 crc kubenswrapper[4830]: I1203 22:21:40.368679 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-cc4b2" event={"ID":"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d","Type":"ContainerStarted","Data":"4f84267ac4f28a7b5ec3f072094afb206fd9914bf26358c58032e1039b603dcb"} Dec 03 22:21:40 crc kubenswrapper[4830]: I1203 22:21:40.370080 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" event={"ID":"ea33811b-aa63-48d4-b9f3-9446ca919e3c","Type":"ContainerStarted","Data":"238540c486cfe3e9604f61e6edb31e88ab740f71e5d0d32a88ebb69aaad4b164"} Dec 03 22:21:40 crc kubenswrapper[4830]: I1203 22:21:40.523435 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-memberlist\") pod \"speaker-4q9rp\" (UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:40 crc kubenswrapper[4830]: I1203 22:21:40.532025 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/194dbe3b-afec-4576-b9ac-8810ad9c9482-memberlist\") pod \"speaker-4q9rp\" 
(UID: \"194dbe3b-afec-4576-b9ac-8810ad9c9482\") " pod="metallb-system/speaker-4q9rp" Dec 03 22:21:40 crc kubenswrapper[4830]: I1203 22:21:40.667792 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4q9rp" Dec 03 22:21:41 crc kubenswrapper[4830]: I1203 22:21:41.377586 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-cc4b2" event={"ID":"36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d","Type":"ContainerStarted","Data":"8dd0e7e1404c930dde52ebab3bac95b8d85b0c93189b998516f6f362b34cdc12"} Dec 03 22:21:41 crc kubenswrapper[4830]: I1203 22:21:41.378923 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:21:41 crc kubenswrapper[4830]: I1203 22:21:41.380371 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4q9rp" event={"ID":"194dbe3b-afec-4576-b9ac-8810ad9c9482","Type":"ContainerStarted","Data":"0389ce7326390524d9128b8c4362d90ea4597b989f59ed1a5af9b7dc8e4400b9"} Dec 03 22:21:41 crc kubenswrapper[4830]: I1203 22:21:41.380415 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4q9rp" event={"ID":"194dbe3b-afec-4576-b9ac-8810ad9c9482","Type":"ContainerStarted","Data":"6e7fc8430f1922dd7ef252a2ce83327944941eec940ff1fa6893eff0bed5ca94"} Dec 03 22:21:41 crc kubenswrapper[4830]: I1203 22:21:41.380428 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4q9rp" event={"ID":"194dbe3b-afec-4576-b9ac-8810ad9c9482","Type":"ContainerStarted","Data":"51dba6fd63b006773ddc6324270a95b35e1900d5abc203b1344eacc4170c37c1"} Dec 03 22:21:41 crc kubenswrapper[4830]: I1203 22:21:41.380691 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-4q9rp" Dec 03 22:21:41 crc kubenswrapper[4830]: I1203 22:21:41.399013 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/speaker-4q9rp" podStartSLOduration=3.398993411 podStartE2EDuration="3.398993411s" podCreationTimestamp="2025-12-03 22:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:21:41.398174419 +0000 UTC m=+990.394635768" watchObservedRunningTime="2025-12-03 22:21:41.398993411 +0000 UTC m=+990.395454760" Dec 03 22:21:41 crc kubenswrapper[4830]: I1203 22:21:41.414718 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-cc4b2" podStartSLOduration=3.414693321 podStartE2EDuration="3.414693321s" podCreationTimestamp="2025-12-03 22:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:21:41.410766813 +0000 UTC m=+990.407228162" watchObservedRunningTime="2025-12-03 22:21:41.414693321 +0000 UTC m=+990.411154670" Dec 03 22:21:47 crc kubenswrapper[4830]: I1203 22:21:47.434547 4830 generic.go:334] "Generic (PLEG): container finished" podID="947e6799-183e-4ab9-8dac-bc02f1232c6e" containerID="de2cf1dbed00b18d401ff7237ebf9e3f02d05243f61d1750f864a2ccd7e62a63" exitCode=0 Dec 03 22:21:47 crc kubenswrapper[4830]: I1203 22:21:47.434658 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-467ht" event={"ID":"947e6799-183e-4ab9-8dac-bc02f1232c6e","Type":"ContainerDied","Data":"de2cf1dbed00b18d401ff7237ebf9e3f02d05243f61d1750f864a2ccd7e62a63"} Dec 03 22:21:47 crc kubenswrapper[4830]: I1203 22:21:47.437959 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" event={"ID":"ea33811b-aa63-48d4-b9f3-9446ca919e3c","Type":"ContainerStarted","Data":"d1260f32a1310f74b3afdfba64a25010ea4567c9a21437db243aedd970094d34"} Dec 03 22:21:47 crc kubenswrapper[4830]: I1203 22:21:47.438259 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" Dec 03 22:21:47 crc kubenswrapper[4830]: I1203 22:21:47.519447 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" podStartSLOduration=2.94489062 podStartE2EDuration="9.519409398s" podCreationTimestamp="2025-12-03 22:21:38 +0000 UTC" firstStartedPulling="2025-12-03 22:21:40.000121569 +0000 UTC m=+988.996582918" lastFinishedPulling="2025-12-03 22:21:46.574640307 +0000 UTC m=+995.571101696" observedRunningTime="2025-12-03 22:21:47.503163414 +0000 UTC m=+996.499624803" watchObservedRunningTime="2025-12-03 22:21:47.519409398 +0000 UTC m=+996.515870797" Dec 03 22:21:48 crc kubenswrapper[4830]: I1203 22:21:48.452634 4830 generic.go:334] "Generic (PLEG): container finished" podID="947e6799-183e-4ab9-8dac-bc02f1232c6e" containerID="55a9a45eacf5c9290ae02a26b9332814c7c63a2469c6585b46fccb5a772b2e61" exitCode=0 Dec 03 22:21:48 crc kubenswrapper[4830]: I1203 22:21:48.452770 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-467ht" event={"ID":"947e6799-183e-4ab9-8dac-bc02f1232c6e","Type":"ContainerDied","Data":"55a9a45eacf5c9290ae02a26b9332814c7c63a2469c6585b46fccb5a772b2e61"} Dec 03 22:21:49 crc kubenswrapper[4830]: I1203 22:21:49.463029 4830 generic.go:334] "Generic (PLEG): container finished" podID="947e6799-183e-4ab9-8dac-bc02f1232c6e" containerID="f998dd2eca5cc4d5f8637cdfa6b925087913df4512901f45709103047c3a4333" exitCode=0 Dec 03 22:21:49 crc kubenswrapper[4830]: I1203 22:21:49.463093 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-467ht" event={"ID":"947e6799-183e-4ab9-8dac-bc02f1232c6e","Type":"ContainerDied","Data":"f998dd2eca5cc4d5f8637cdfa6b925087913df4512901f45709103047c3a4333"} Dec 03 22:21:50 crc kubenswrapper[4830]: I1203 22:21:50.474819 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-467ht" 
event={"ID":"947e6799-183e-4ab9-8dac-bc02f1232c6e","Type":"ContainerStarted","Data":"a44f7a9deaa6fb64ad8191fa118913718eb9f9be2b24d79e807956f8f4c69abd"} Dec 03 22:21:50 crc kubenswrapper[4830]: I1203 22:21:50.475166 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-467ht" event={"ID":"947e6799-183e-4ab9-8dac-bc02f1232c6e","Type":"ContainerStarted","Data":"d5fb80d8816a9333a8dc0df4ec5fe1810e7b54522fef6e00054e106b0c019996"} Dec 03 22:21:50 crc kubenswrapper[4830]: I1203 22:21:50.475178 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-467ht" event={"ID":"947e6799-183e-4ab9-8dac-bc02f1232c6e","Type":"ContainerStarted","Data":"737011bf0750dd18a8b736fa09155556512870e53680e4b152bb666eda24670c"} Dec 03 22:21:50 crc kubenswrapper[4830]: I1203 22:21:50.475187 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-467ht" event={"ID":"947e6799-183e-4ab9-8dac-bc02f1232c6e","Type":"ContainerStarted","Data":"0024a509b0fd43509ebcb09318c186dc2922c1e9e1ec18dfac4e547fddc9eff9"} Dec 03 22:21:50 crc kubenswrapper[4830]: I1203 22:21:50.475196 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-467ht" event={"ID":"947e6799-183e-4ab9-8dac-bc02f1232c6e","Type":"ContainerStarted","Data":"d0d4617685dc0463adc4bbf501d8b152b9352c4cca3c8a69be2c3f5d38cc783b"} Dec 03 22:21:51 crc kubenswrapper[4830]: I1203 22:21:51.487232 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-467ht" event={"ID":"947e6799-183e-4ab9-8dac-bc02f1232c6e","Type":"ContainerStarted","Data":"03affa6e128e8a17212ccf065fdc579f46c7e89872776f6079f43d3c3ae5e3cc"} Dec 03 22:21:51 crc kubenswrapper[4830]: I1203 22:21:51.487610 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:51 crc kubenswrapper[4830]: I1203 22:21:51.511500 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-467ht" podStartSLOduration=6.119919856 podStartE2EDuration="13.511478645s" podCreationTimestamp="2025-12-03 22:21:38 +0000 UTC" firstStartedPulling="2025-12-03 22:21:39.200056492 +0000 UTC m=+988.196517841" lastFinishedPulling="2025-12-03 22:21:46.591615231 +0000 UTC m=+995.588076630" observedRunningTime="2025-12-03 22:21:51.509869011 +0000 UTC m=+1000.506330370" watchObservedRunningTime="2025-12-03 22:21:51.511478645 +0000 UTC m=+1000.507940004" Dec 03 22:21:54 crc kubenswrapper[4830]: I1203 22:21:54.060194 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:54 crc kubenswrapper[4830]: I1203 22:21:54.129163 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:56 crc kubenswrapper[4830]: I1203 22:21:56.681711 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:21:56 crc kubenswrapper[4830]: I1203 22:21:56.682090 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:21:59 crc kubenswrapper[4830]: I1203 22:21:59.067431 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-467ht" Dec 03 22:21:59 crc kubenswrapper[4830]: I1203 22:21:59.693471 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-dj5pn" Dec 03 22:21:59 crc kubenswrapper[4830]: I1203 
22:21:59.881007 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-cc4b2" Dec 03 22:22:00 crc kubenswrapper[4830]: I1203 22:22:00.673450 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-4q9rp" Dec 03 22:22:03 crc kubenswrapper[4830]: I1203 22:22:03.890096 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-njtgs"] Dec 03 22:22:03 crc kubenswrapper[4830]: I1203 22:22:03.897290 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-njtgs" Dec 03 22:22:03 crc kubenswrapper[4830]: I1203 22:22:03.902603 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 22:22:03 crc kubenswrapper[4830]: I1203 22:22:03.903527 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 22:22:03 crc kubenswrapper[4830]: I1203 22:22:03.906700 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2b64n" Dec 03 22:22:03 crc kubenswrapper[4830]: I1203 22:22:03.915805 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-njtgs"] Dec 03 22:22:03 crc kubenswrapper[4830]: I1203 22:22:03.971932 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2vfr\" (UniqueName: \"kubernetes.io/projected/3537d62c-f1b7-4eb0-aef3-ce6b305aa607-kube-api-access-b2vfr\") pod \"openstack-operator-index-njtgs\" (UID: \"3537d62c-f1b7-4eb0-aef3-ce6b305aa607\") " pod="openstack-operators/openstack-operator-index-njtgs" Dec 03 22:22:04 crc kubenswrapper[4830]: I1203 22:22:04.074273 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b2vfr\" (UniqueName: \"kubernetes.io/projected/3537d62c-f1b7-4eb0-aef3-ce6b305aa607-kube-api-access-b2vfr\") pod \"openstack-operator-index-njtgs\" (UID: \"3537d62c-f1b7-4eb0-aef3-ce6b305aa607\") " pod="openstack-operators/openstack-operator-index-njtgs" Dec 03 22:22:04 crc kubenswrapper[4830]: I1203 22:22:04.099381 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2vfr\" (UniqueName: \"kubernetes.io/projected/3537d62c-f1b7-4eb0-aef3-ce6b305aa607-kube-api-access-b2vfr\") pod \"openstack-operator-index-njtgs\" (UID: \"3537d62c-f1b7-4eb0-aef3-ce6b305aa607\") " pod="openstack-operators/openstack-operator-index-njtgs" Dec 03 22:22:04 crc kubenswrapper[4830]: I1203 22:22:04.224260 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-njtgs" Dec 03 22:22:04 crc kubenswrapper[4830]: I1203 22:22:04.633711 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-njtgs"] Dec 03 22:22:04 crc kubenswrapper[4830]: I1203 22:22:04.648584 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:22:05 crc kubenswrapper[4830]: I1203 22:22:05.602895 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-njtgs" event={"ID":"3537d62c-f1b7-4eb0-aef3-ce6b305aa607","Type":"ContainerStarted","Data":"082d8196768bfa83fea548da0935826e783bc94222135b4d864b9277fa0ced16"} Dec 03 22:22:06 crc kubenswrapper[4830]: I1203 22:22:06.057692 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-njtgs"] Dec 03 22:22:06 crc kubenswrapper[4830]: I1203 22:22:06.661654 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-pm2p5"] Dec 03 22:22:06 crc kubenswrapper[4830]: I1203 22:22:06.663363 4830 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pm2p5" Dec 03 22:22:06 crc kubenswrapper[4830]: I1203 22:22:06.671646 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pm2p5"] Dec 03 22:22:06 crc kubenswrapper[4830]: I1203 22:22:06.713972 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lncnv\" (UniqueName: \"kubernetes.io/projected/e3d5735a-003c-4c35-9239-c80e7e6dbafc-kube-api-access-lncnv\") pod \"openstack-operator-index-pm2p5\" (UID: \"e3d5735a-003c-4c35-9239-c80e7e6dbafc\") " pod="openstack-operators/openstack-operator-index-pm2p5" Dec 03 22:22:06 crc kubenswrapper[4830]: I1203 22:22:06.814990 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lncnv\" (UniqueName: \"kubernetes.io/projected/e3d5735a-003c-4c35-9239-c80e7e6dbafc-kube-api-access-lncnv\") pod \"openstack-operator-index-pm2p5\" (UID: \"e3d5735a-003c-4c35-9239-c80e7e6dbafc\") " pod="openstack-operators/openstack-operator-index-pm2p5" Dec 03 22:22:06 crc kubenswrapper[4830]: I1203 22:22:06.849637 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lncnv\" (UniqueName: \"kubernetes.io/projected/e3d5735a-003c-4c35-9239-c80e7e6dbafc-kube-api-access-lncnv\") pod \"openstack-operator-index-pm2p5\" (UID: \"e3d5735a-003c-4c35-9239-c80e7e6dbafc\") " pod="openstack-operators/openstack-operator-index-pm2p5" Dec 03 22:22:06 crc kubenswrapper[4830]: I1203 22:22:06.993497 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pm2p5" Dec 03 22:22:07 crc kubenswrapper[4830]: I1203 22:22:07.922939 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pm2p5"] Dec 03 22:22:08 crc kubenswrapper[4830]: W1203 22:22:08.465652 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3d5735a_003c_4c35_9239_c80e7e6dbafc.slice/crio-67f644d75b345d7745666f0204bd2cc7076586fcaa4f0e54bf0f76b7f67aae70 WatchSource:0}: Error finding container 67f644d75b345d7745666f0204bd2cc7076586fcaa4f0e54bf0f76b7f67aae70: Status 404 returned error can't find the container with id 67f644d75b345d7745666f0204bd2cc7076586fcaa4f0e54bf0f76b7f67aae70 Dec 03 22:22:08 crc kubenswrapper[4830]: I1203 22:22:08.625931 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pm2p5" event={"ID":"e3d5735a-003c-4c35-9239-c80e7e6dbafc","Type":"ContainerStarted","Data":"67f644d75b345d7745666f0204bd2cc7076586fcaa4f0e54bf0f76b7f67aae70"} Dec 03 22:22:09 crc kubenswrapper[4830]: I1203 22:22:09.636087 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pm2p5" event={"ID":"e3d5735a-003c-4c35-9239-c80e7e6dbafc","Type":"ContainerStarted","Data":"7ae58597974bb063821edea996de8313c1e9cf0a1a856ee54df0e79adad2d70e"} Dec 03 22:22:09 crc kubenswrapper[4830]: I1203 22:22:09.639051 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-njtgs" event={"ID":"3537d62c-f1b7-4eb0-aef3-ce6b305aa607","Type":"ContainerStarted","Data":"901a5ab80ab81a7b281dc149e825620bef6287d3e6876c4652ef7fe15622dd24"} Dec 03 22:22:09 crc kubenswrapper[4830]: I1203 22:22:09.639306 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-njtgs" 
podUID="3537d62c-f1b7-4eb0-aef3-ce6b305aa607" containerName="registry-server" containerID="cri-o://901a5ab80ab81a7b281dc149e825620bef6287d3e6876c4652ef7fe15622dd24" gracePeriod=2 Dec 03 22:22:09 crc kubenswrapper[4830]: I1203 22:22:09.672669 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-pm2p5" podStartSLOduration=3.60136319 podStartE2EDuration="3.672642978s" podCreationTimestamp="2025-12-03 22:22:06 +0000 UTC" firstStartedPulling="2025-12-03 22:22:08.467203733 +0000 UTC m=+1017.463665082" lastFinishedPulling="2025-12-03 22:22:08.538483531 +0000 UTC m=+1017.534944870" observedRunningTime="2025-12-03 22:22:09.662022988 +0000 UTC m=+1018.658484387" watchObservedRunningTime="2025-12-03 22:22:09.672642978 +0000 UTC m=+1018.669104357" Dec 03 22:22:09 crc kubenswrapper[4830]: I1203 22:22:09.705265 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-njtgs" podStartSLOduration=2.850928136 podStartE2EDuration="6.705221238s" podCreationTimestamp="2025-12-03 22:22:03 +0000 UTC" firstStartedPulling="2025-12-03 22:22:04.648300698 +0000 UTC m=+1013.644762047" lastFinishedPulling="2025-12-03 22:22:08.5025938 +0000 UTC m=+1017.499055149" observedRunningTime="2025-12-03 22:22:09.70196788 +0000 UTC m=+1018.698429309" watchObservedRunningTime="2025-12-03 22:22:09.705221238 +0000 UTC m=+1018.701682607" Dec 03 22:22:10 crc kubenswrapper[4830]: I1203 22:22:10.649151 4830 generic.go:334] "Generic (PLEG): container finished" podID="3537d62c-f1b7-4eb0-aef3-ce6b305aa607" containerID="901a5ab80ab81a7b281dc149e825620bef6287d3e6876c4652ef7fe15622dd24" exitCode=0 Dec 03 22:22:10 crc kubenswrapper[4830]: I1203 22:22:10.649649 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-njtgs" 
event={"ID":"3537d62c-f1b7-4eb0-aef3-ce6b305aa607","Type":"ContainerDied","Data":"901a5ab80ab81a7b281dc149e825620bef6287d3e6876c4652ef7fe15622dd24"} Dec 03 22:22:11 crc kubenswrapper[4830]: I1203 22:22:11.256750 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-njtgs" Dec 03 22:22:11 crc kubenswrapper[4830]: I1203 22:22:11.389080 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2vfr\" (UniqueName: \"kubernetes.io/projected/3537d62c-f1b7-4eb0-aef3-ce6b305aa607-kube-api-access-b2vfr\") pod \"3537d62c-f1b7-4eb0-aef3-ce6b305aa607\" (UID: \"3537d62c-f1b7-4eb0-aef3-ce6b305aa607\") " Dec 03 22:22:11 crc kubenswrapper[4830]: I1203 22:22:11.395141 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3537d62c-f1b7-4eb0-aef3-ce6b305aa607-kube-api-access-b2vfr" (OuterVolumeSpecName: "kube-api-access-b2vfr") pod "3537d62c-f1b7-4eb0-aef3-ce6b305aa607" (UID: "3537d62c-f1b7-4eb0-aef3-ce6b305aa607"). InnerVolumeSpecName "kube-api-access-b2vfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:22:11 crc kubenswrapper[4830]: I1203 22:22:11.491414 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2vfr\" (UniqueName: \"kubernetes.io/projected/3537d62c-f1b7-4eb0-aef3-ce6b305aa607-kube-api-access-b2vfr\") on node \"crc\" DevicePath \"\"" Dec 03 22:22:11 crc kubenswrapper[4830]: I1203 22:22:11.662467 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-njtgs" event={"ID":"3537d62c-f1b7-4eb0-aef3-ce6b305aa607","Type":"ContainerDied","Data":"082d8196768bfa83fea548da0935826e783bc94222135b4d864b9277fa0ced16"} Dec 03 22:22:11 crc kubenswrapper[4830]: I1203 22:22:11.662568 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-njtgs" Dec 03 22:22:11 crc kubenswrapper[4830]: I1203 22:22:11.663719 4830 scope.go:117] "RemoveContainer" containerID="901a5ab80ab81a7b281dc149e825620bef6287d3e6876c4652ef7fe15622dd24" Dec 03 22:22:11 crc kubenswrapper[4830]: I1203 22:22:11.716705 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-njtgs"] Dec 03 22:22:11 crc kubenswrapper[4830]: I1203 22:22:11.725805 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-njtgs"] Dec 03 22:22:13 crc kubenswrapper[4830]: I1203 22:22:13.351548 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3537d62c-f1b7-4eb0-aef3-ce6b305aa607" path="/var/lib/kubelet/pods/3537d62c-f1b7-4eb0-aef3-ce6b305aa607/volumes" Dec 03 22:22:16 crc kubenswrapper[4830]: I1203 22:22:16.993784 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-pm2p5" Dec 03 22:22:16 crc kubenswrapper[4830]: I1203 22:22:16.994712 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-pm2p5" Dec 03 22:22:17 crc kubenswrapper[4830]: I1203 22:22:17.041086 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-pm2p5" Dec 03 22:22:17 crc kubenswrapper[4830]: I1203 22:22:17.745610 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-pm2p5" Dec 03 22:22:24 crc kubenswrapper[4830]: I1203 22:22:24.991246 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9"] Dec 03 22:22:24 crc kubenswrapper[4830]: E1203 22:22:24.992229 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3537d62c-f1b7-4eb0-aef3-ce6b305aa607" containerName="registry-server" Dec 03 22:22:24 crc kubenswrapper[4830]: I1203 22:22:24.992252 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3537d62c-f1b7-4eb0-aef3-ce6b305aa607" containerName="registry-server" Dec 03 22:22:24 crc kubenswrapper[4830]: I1203 22:22:24.992476 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3537d62c-f1b7-4eb0-aef3-ce6b305aa607" containerName="registry-server" Dec 03 22:22:24 crc kubenswrapper[4830]: I1203 22:22:24.994237 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:24 crc kubenswrapper[4830]: I1203 22:22:24.997919 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-hldzj" Dec 03 22:22:25 crc kubenswrapper[4830]: I1203 22:22:25.005334 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9"] Dec 03 22:22:25 crc kubenswrapper[4830]: I1203 22:22:25.043082 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-util\") pod \"d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:25 crc kubenswrapper[4830]: I1203 22:22:25.043147 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-bundle\") pod \"d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " 
pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:25 crc kubenswrapper[4830]: I1203 22:22:25.043186 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f85m7\" (UniqueName: \"kubernetes.io/projected/53e3bc06-3d65-43f4-a54f-638f4871c97f-kube-api-access-f85m7\") pod \"d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:25 crc kubenswrapper[4830]: I1203 22:22:25.144376 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-util\") pod \"d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:25 crc kubenswrapper[4830]: I1203 22:22:25.144445 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-bundle\") pod \"d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:25 crc kubenswrapper[4830]: I1203 22:22:25.144485 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f85m7\" (UniqueName: \"kubernetes.io/projected/53e3bc06-3d65-43f4-a54f-638f4871c97f-kube-api-access-f85m7\") pod \"d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:25 crc 
kubenswrapper[4830]: I1203 22:22:25.145205 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-bundle\") pod \"d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:25 crc kubenswrapper[4830]: I1203 22:22:25.146829 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-util\") pod \"d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:25 crc kubenswrapper[4830]: I1203 22:22:25.178809 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f85m7\" (UniqueName: \"kubernetes.io/projected/53e3bc06-3d65-43f4-a54f-638f4871c97f-kube-api-access-f85m7\") pod \"d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:25 crc kubenswrapper[4830]: I1203 22:22:25.319830 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:25 crc kubenswrapper[4830]: I1203 22:22:25.855566 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9"] Dec 03 22:22:26 crc kubenswrapper[4830]: I1203 22:22:26.682199 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:22:26 crc kubenswrapper[4830]: I1203 22:22:26.682635 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:22:26 crc kubenswrapper[4830]: I1203 22:22:26.682727 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:22:26 crc kubenswrapper[4830]: I1203 22:22:26.683876 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"942bb799e68a31057e858be496e721bb353443055eb7deda485ab32976586b59"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 22:22:26 crc kubenswrapper[4830]: I1203 22:22:26.683980 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" 
containerName="machine-config-daemon" containerID="cri-o://942bb799e68a31057e858be496e721bb353443055eb7deda485ab32976586b59" gracePeriod=600 Dec 03 22:22:26 crc kubenswrapper[4830]: I1203 22:22:26.798981 4830 generic.go:334] "Generic (PLEG): container finished" podID="53e3bc06-3d65-43f4-a54f-638f4871c97f" containerID="4a71cab8155f422f7224b16c41bf0b3f832f2daae33244db1a550b68784b2129" exitCode=0 Dec 03 22:22:26 crc kubenswrapper[4830]: I1203 22:22:26.799096 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" event={"ID":"53e3bc06-3d65-43f4-a54f-638f4871c97f","Type":"ContainerDied","Data":"4a71cab8155f422f7224b16c41bf0b3f832f2daae33244db1a550b68784b2129"} Dec 03 22:22:26 crc kubenswrapper[4830]: I1203 22:22:26.799991 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" event={"ID":"53e3bc06-3d65-43f4-a54f-638f4871c97f","Type":"ContainerStarted","Data":"8508128d20815aa0a9450cb49c6592dc73ae1d69341989a1bd94e143636e495f"} Dec 03 22:22:27 crc kubenswrapper[4830]: I1203 22:22:27.812186 4830 generic.go:334] "Generic (PLEG): container finished" podID="53e3bc06-3d65-43f4-a54f-638f4871c97f" containerID="7e382f73e418e9d0731894836347c2494112f77aa829f1e0468e541b902454ee" exitCode=0 Dec 03 22:22:27 crc kubenswrapper[4830]: I1203 22:22:27.812288 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" event={"ID":"53e3bc06-3d65-43f4-a54f-638f4871c97f","Type":"ContainerDied","Data":"7e382f73e418e9d0731894836347c2494112f77aa829f1e0468e541b902454ee"} Dec 03 22:22:27 crc kubenswrapper[4830]: I1203 22:22:27.819108 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="942bb799e68a31057e858be496e721bb353443055eb7deda485ab32976586b59" exitCode=0 Dec 03 22:22:27 
crc kubenswrapper[4830]: I1203 22:22:27.819165 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"942bb799e68a31057e858be496e721bb353443055eb7deda485ab32976586b59"} Dec 03 22:22:27 crc kubenswrapper[4830]: I1203 22:22:27.819203 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"b6699c360d5a693c67050c7e96ce75da1159981ce57e7b4db9870c6676af8453"} Dec 03 22:22:27 crc kubenswrapper[4830]: I1203 22:22:27.819231 4830 scope.go:117] "RemoveContainer" containerID="171c35f7222805b1cd5f3e37402f9fb3e95a2ab3a3d5d02e3ef17eeb4f38ce9b" Dec 03 22:22:28 crc kubenswrapper[4830]: I1203 22:22:28.829717 4830 generic.go:334] "Generic (PLEG): container finished" podID="53e3bc06-3d65-43f4-a54f-638f4871c97f" containerID="c512f27cb2149b8cded2bcfed0454001f7f58ed9f76fb60a78b49d4e30836f8a" exitCode=0 Dec 03 22:22:28 crc kubenswrapper[4830]: I1203 22:22:28.829858 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" event={"ID":"53e3bc06-3d65-43f4-a54f-638f4871c97f","Type":"ContainerDied","Data":"c512f27cb2149b8cded2bcfed0454001f7f58ed9f76fb60a78b49d4e30836f8a"} Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.175295 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.320492 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f85m7\" (UniqueName: \"kubernetes.io/projected/53e3bc06-3d65-43f4-a54f-638f4871c97f-kube-api-access-f85m7\") pod \"53e3bc06-3d65-43f4-a54f-638f4871c97f\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.320674 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-util\") pod \"53e3bc06-3d65-43f4-a54f-638f4871c97f\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.320766 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-bundle\") pod \"53e3bc06-3d65-43f4-a54f-638f4871c97f\" (UID: \"53e3bc06-3d65-43f4-a54f-638f4871c97f\") " Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.321930 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-bundle" (OuterVolumeSpecName: "bundle") pod "53e3bc06-3d65-43f4-a54f-638f4871c97f" (UID: "53e3bc06-3d65-43f4-a54f-638f4871c97f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.329600 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e3bc06-3d65-43f4-a54f-638f4871c97f-kube-api-access-f85m7" (OuterVolumeSpecName: "kube-api-access-f85m7") pod "53e3bc06-3d65-43f4-a54f-638f4871c97f" (UID: "53e3bc06-3d65-43f4-a54f-638f4871c97f"). InnerVolumeSpecName "kube-api-access-f85m7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.353398 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-util" (OuterVolumeSpecName: "util") pod "53e3bc06-3d65-43f4-a54f-638f4871c97f" (UID: "53e3bc06-3d65-43f4-a54f-638f4871c97f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.423175 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f85m7\" (UniqueName: \"kubernetes.io/projected/53e3bc06-3d65-43f4-a54f-638f4871c97f-kube-api-access-f85m7\") on node \"crc\" DevicePath \"\"" Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.423471 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-util\") on node \"crc\" DevicePath \"\"" Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.423484 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53e3bc06-3d65-43f4-a54f-638f4871c97f-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.859753 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" event={"ID":"53e3bc06-3d65-43f4-a54f-638f4871c97f","Type":"ContainerDied","Data":"8508128d20815aa0a9450cb49c6592dc73ae1d69341989a1bd94e143636e495f"} Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.859832 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8508128d20815aa0a9450cb49c6592dc73ae1d69341989a1bd94e143636e495f" Dec 03 22:22:30 crc kubenswrapper[4830]: I1203 22:22:30.860033 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.169966 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq"] Dec 03 22:22:37 crc kubenswrapper[4830]: E1203 22:22:37.172003 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e3bc06-3d65-43f4-a54f-638f4871c97f" containerName="util" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.172092 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e3bc06-3d65-43f4-a54f-638f4871c97f" containerName="util" Dec 03 22:22:37 crc kubenswrapper[4830]: E1203 22:22:37.172163 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e3bc06-3d65-43f4-a54f-638f4871c97f" containerName="pull" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.172234 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e3bc06-3d65-43f4-a54f-638f4871c97f" containerName="pull" Dec 03 22:22:37 crc kubenswrapper[4830]: E1203 22:22:37.172305 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e3bc06-3d65-43f4-a54f-638f4871c97f" containerName="extract" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.172376 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e3bc06-3d65-43f4-a54f-638f4871c97f" containerName="extract" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.172834 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e3bc06-3d65-43f4-a54f-638f4871c97f" containerName="extract" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.173531 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.176658 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-xmc57" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.206879 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq"] Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.323236 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq64n\" (UniqueName: \"kubernetes.io/projected/dd670738-8808-4f9b-8fea-98c4ab57fb06-kube-api-access-lq64n\") pod \"openstack-operator-controller-operator-8545cc8fb-sm7mq\" (UID: \"dd670738-8808-4f9b-8fea-98c4ab57fb06\") " pod="openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.425198 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq64n\" (UniqueName: \"kubernetes.io/projected/dd670738-8808-4f9b-8fea-98c4ab57fb06-kube-api-access-lq64n\") pod \"openstack-operator-controller-operator-8545cc8fb-sm7mq\" (UID: \"dd670738-8808-4f9b-8fea-98c4ab57fb06\") " pod="openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.447364 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq64n\" (UniqueName: \"kubernetes.io/projected/dd670738-8808-4f9b-8fea-98c4ab57fb06-kube-api-access-lq64n\") pod \"openstack-operator-controller-operator-8545cc8fb-sm7mq\" (UID: \"dd670738-8808-4f9b-8fea-98c4ab57fb06\") " pod="openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.492969 4830 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq" Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.763299 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq"] Dec 03 22:22:37 crc kubenswrapper[4830]: W1203 22:22:37.769837 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd670738_8808_4f9b_8fea_98c4ab57fb06.slice/crio-fb31c3b516ebb64869e6719f8f8f513770515a485a3fae0acd1b7df7985c7a97 WatchSource:0}: Error finding container fb31c3b516ebb64869e6719f8f8f513770515a485a3fae0acd1b7df7985c7a97: Status 404 returned error can't find the container with id fb31c3b516ebb64869e6719f8f8f513770515a485a3fae0acd1b7df7985c7a97 Dec 03 22:22:37 crc kubenswrapper[4830]: I1203 22:22:37.910147 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq" event={"ID":"dd670738-8808-4f9b-8fea-98c4ab57fb06","Type":"ContainerStarted","Data":"fb31c3b516ebb64869e6719f8f8f513770515a485a3fae0acd1b7df7985c7a97"} Dec 03 22:22:42 crc kubenswrapper[4830]: I1203 22:22:42.950929 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq" event={"ID":"dd670738-8808-4f9b-8fea-98c4ab57fb06","Type":"ContainerStarted","Data":"4ee90f7d4cfce255208839b45552d5c5728daa023cc12fbbafd520b5e83ce8f3"} Dec 03 22:22:42 crc kubenswrapper[4830]: I1203 22:22:42.951490 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq" Dec 03 22:22:42 crc kubenswrapper[4830]: I1203 22:22:42.991236 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq" podStartSLOduration=1.639871085 podStartE2EDuration="5.991203956s" podCreationTimestamp="2025-12-03 22:22:37 +0000 UTC" firstStartedPulling="2025-12-03 22:22:37.772728324 +0000 UTC m=+1046.769189683" lastFinishedPulling="2025-12-03 22:22:42.124061205 +0000 UTC m=+1051.120522554" observedRunningTime="2025-12-03 22:22:42.977843293 +0000 UTC m=+1051.974304652" watchObservedRunningTime="2025-12-03 22:22:42.991203956 +0000 UTC m=+1051.987665345" Dec 03 22:22:47 crc kubenswrapper[4830]: I1203 22:22:47.497697 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-8545cc8fb-sm7mq" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.517771 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.519370 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.521630 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-njtgk" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.524398 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.525624 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.527435 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5bgmh" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.532178 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.537619 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.547805 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.549333 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.552724 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-w8z4p" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.554960 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.555955 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.559024 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-phnxg" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.584167 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.586526 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hbpw\" (UniqueName: \"kubernetes.io/projected/22f7d8a7-b9bf-40ca-aca3-13a370558f38-kube-api-access-5hbpw\") pod \"designate-operator-controller-manager-78b4bc895b-zgtdz\" (UID: \"22f7d8a7-b9bf-40ca-aca3-13a370558f38\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.586568 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnf9t\" (UniqueName: \"kubernetes.io/projected/0ecd210b-fb48-42fe-b161-6583d913b6f8-kube-api-access-gnf9t\") pod \"cinder-operator-controller-manager-859b6ccc6-rhhdr\" (UID: \"0ecd210b-fb48-42fe-b161-6583d913b6f8\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.586595 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jjvk\" (UniqueName: \"kubernetes.io/projected/cca79891-68e7-4827-8da4-c0570dbca762-kube-api-access-5jjvk\") pod \"barbican-operator-controller-manager-7d9dfd778-qnp24\" (UID: \"cca79891-68e7-4827-8da4-c0570dbca762\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 
22:23:26.586625 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6zdh\" (UniqueName: \"kubernetes.io/projected/e8bc8bcd-fde2-43fd-86ae-814182f2f5ac-kube-api-access-s6zdh\") pod \"glance-operator-controller-manager-77987cd8cd-qcz7k\" (UID: \"e8bc8bcd-fde2-43fd-86ae-814182f2f5ac\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.599586 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.609301 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.610782 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.619935 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-whhwn" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.626132 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.627239 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.630730 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-vsmx4" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.643051 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.664331 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-pd68x"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.665398 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.667274 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-pljj7" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.667474 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.684902 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.690324 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnf9t\" (UniqueName: \"kubernetes.io/projected/0ecd210b-fb48-42fe-b161-6583d913b6f8-kube-api-access-gnf9t\") pod \"cinder-operator-controller-manager-859b6ccc6-rhhdr\" (UID: \"0ecd210b-fb48-42fe-b161-6583d913b6f8\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr" Dec 03 22:23:26 crc 
kubenswrapper[4830]: I1203 22:23:26.690365 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jjvk\" (UniqueName: \"kubernetes.io/projected/cca79891-68e7-4827-8da4-c0570dbca762-kube-api-access-5jjvk\") pod \"barbican-operator-controller-manager-7d9dfd778-qnp24\" (UID: \"cca79891-68e7-4827-8da4-c0570dbca762\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.690393 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl54d\" (UniqueName: \"kubernetes.io/projected/a5f4a0b7-d118-45b5-ab87-9f03413d4671-kube-api-access-zl54d\") pod \"horizon-operator-controller-manager-68c6d99b8f-jj85k\" (UID: \"a5f4a0b7-d118-45b5-ab87-9f03413d4671\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.690417 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6zdh\" (UniqueName: \"kubernetes.io/projected/e8bc8bcd-fde2-43fd-86ae-814182f2f5ac-kube-api-access-s6zdh\") pod \"glance-operator-controller-manager-77987cd8cd-qcz7k\" (UID: \"e8bc8bcd-fde2-43fd-86ae-814182f2f5ac\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.690435 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84c5h\" (UniqueName: \"kubernetes.io/projected/5e5620ec-6ef3-47fc-b88b-06a2f2849b48-kube-api-access-84c5h\") pod \"heat-operator-controller-manager-5f64f6f8bb-nggxg\" (UID: \"5e5620ec-6ef3-47fc-b88b-06a2f2849b48\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.690471 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert\") pod \"infra-operator-controller-manager-57548d458d-pd68x\" (UID: \"4cf3851e-6624-48c2-aa71-e799e6b6b685\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.690488 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xjp7\" (UniqueName: \"kubernetes.io/projected/4cf3851e-6624-48c2-aa71-e799e6b6b685-kube-api-access-2xjp7\") pod \"infra-operator-controller-manager-57548d458d-pd68x\" (UID: \"4cf3851e-6624-48c2-aa71-e799e6b6b685\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.690537 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hbpw\" (UniqueName: \"kubernetes.io/projected/22f7d8a7-b9bf-40ca-aca3-13a370558f38-kube-api-access-5hbpw\") pod \"designate-operator-controller-manager-78b4bc895b-zgtdz\" (UID: \"22f7d8a7-b9bf-40ca-aca3-13a370558f38\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.691548 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-pd68x"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.726011 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jjvk\" (UniqueName: \"kubernetes.io/projected/cca79891-68e7-4827-8da4-c0570dbca762-kube-api-access-5jjvk\") pod \"barbican-operator-controller-manager-7d9dfd778-qnp24\" (UID: \"cca79891-68e7-4827-8da4-c0570dbca762\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.728658 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-s6zdh\" (UniqueName: \"kubernetes.io/projected/e8bc8bcd-fde2-43fd-86ae-814182f2f5ac-kube-api-access-s6zdh\") pod \"glance-operator-controller-manager-77987cd8cd-qcz7k\" (UID: \"e8bc8bcd-fde2-43fd-86ae-814182f2f5ac\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.729118 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnf9t\" (UniqueName: \"kubernetes.io/projected/0ecd210b-fb48-42fe-b161-6583d913b6f8-kube-api-access-gnf9t\") pod \"cinder-operator-controller-manager-859b6ccc6-rhhdr\" (UID: \"0ecd210b-fb48-42fe-b161-6583d913b6f8\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.748284 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hbpw\" (UniqueName: \"kubernetes.io/projected/22f7d8a7-b9bf-40ca-aca3-13a370558f38-kube-api-access-5hbpw\") pod \"designate-operator-controller-manager-78b4bc895b-zgtdz\" (UID: \"22f7d8a7-b9bf-40ca-aca3-13a370558f38\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.748617 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.757318 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.765940 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-tvhbf" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.766629 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.812843 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl54d\" (UniqueName: \"kubernetes.io/projected/a5f4a0b7-d118-45b5-ab87-9f03413d4671-kube-api-access-zl54d\") pod \"horizon-operator-controller-manager-68c6d99b8f-jj85k\" (UID: \"a5f4a0b7-d118-45b5-ab87-9f03413d4671\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.812883 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwbgc\" (UniqueName: \"kubernetes.io/projected/b0dc8ce5-ac38-4bac-8026-5ca446e16340-kube-api-access-dwbgc\") pod \"ironic-operator-controller-manager-6c548fd776-5t4bj\" (UID: \"b0dc8ce5-ac38-4bac-8026-5ca446e16340\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.812919 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84c5h\" (UniqueName: \"kubernetes.io/projected/5e5620ec-6ef3-47fc-b88b-06a2f2849b48-kube-api-access-84c5h\") pod \"heat-operator-controller-manager-5f64f6f8bb-nggxg\" (UID: \"5e5620ec-6ef3-47fc-b88b-06a2f2849b48\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.812958 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert\") pod \"infra-operator-controller-manager-57548d458d-pd68x\" (UID: \"4cf3851e-6624-48c2-aa71-e799e6b6b685\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.812980 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xjp7\" (UniqueName: \"kubernetes.io/projected/4cf3851e-6624-48c2-aa71-e799e6b6b685-kube-api-access-2xjp7\") pod \"infra-operator-controller-manager-57548d458d-pd68x\" (UID: \"4cf3851e-6624-48c2-aa71-e799e6b6b685\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.813205 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq"] Dec 03 22:23:26 crc kubenswrapper[4830]: E1203 22:23:26.813470 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 22:23:26 crc kubenswrapper[4830]: E1203 22:23:26.813530 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert podName:4cf3851e-6624-48c2-aa71-e799e6b6b685 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:27.313500112 +0000 UTC m=+1096.309961461 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert") pod "infra-operator-controller-manager-57548d458d-pd68x" (UID: "4cf3851e-6624-48c2-aa71-e799e6b6b685") : secret "infra-operator-webhook-server-cert" not found Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.814632 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.823878 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-wqkfb" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.832983 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.847409 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.848561 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.850303 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.851525 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.852856 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-zzld4" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.855980 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.857430 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wlbq2" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.858655 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84c5h\" (UniqueName: \"kubernetes.io/projected/5e5620ec-6ef3-47fc-b88b-06a2f2849b48-kube-api-access-84c5h\") pod \"heat-operator-controller-manager-5f64f6f8bb-nggxg\" (UID: \"5e5620ec-6ef3-47fc-b88b-06a2f2849b48\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.863685 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.864061 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.866266 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl54d\" (UniqueName: \"kubernetes.io/projected/a5f4a0b7-d118-45b5-ab87-9f03413d4671-kube-api-access-zl54d\") pod \"horizon-operator-controller-manager-68c6d99b8f-jj85k\" (UID: \"a5f4a0b7-d118-45b5-ab87-9f03413d4671\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.868822 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.869889 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.885266 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-62v2v" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.885907 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xjp7\" (UniqueName: \"kubernetes.io/projected/4cf3851e-6624-48c2-aa71-e799e6b6b685-kube-api-access-2xjp7\") pod \"infra-operator-controller-manager-57548d458d-pd68x\" (UID: \"4cf3851e-6624-48c2-aa71-e799e6b6b685\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.886158 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.886482 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.895731 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.897209 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.903720 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-526ng"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.904845 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.906200 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-xmhcb" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.909323 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.911796 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.914190 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9cck\" (UniqueName: \"kubernetes.io/projected/325f811a-891b-48ae-bde4-a72e7580c925-kube-api-access-l9cck\") pod \"mariadb-operator-controller-manager-56bbcc9d85-48cc9\" (UID: \"325f811a-891b-48ae-bde4-a72e7580c925\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.914315 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hlzn\" (UniqueName: \"kubernetes.io/projected/e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae-kube-api-access-6hlzn\") pod \"keystone-operator-controller-manager-7765d96ddf-4fshq\" (UID: \"e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.914399 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9rd5\" (UniqueName: \"kubernetes.io/projected/90ea4083-18d1-4ace-bcc6-81489c41f117-kube-api-access-v9rd5\") pod 
\"neutron-operator-controller-manager-5fdfd5b6b5-v2gwm\" (UID: \"90ea4083-18d1-4ace-bcc6-81489c41f117\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.914428 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sql84\" (UniqueName: \"kubernetes.io/projected/01175cc5-e6fa-4e26-b76b-6b7e2a71d51a-kube-api-access-sql84\") pod \"manila-operator-controller-manager-7c79b5df47-8wlrd\" (UID: \"01175cc5-e6fa-4e26-b76b-6b7e2a71d51a\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.914461 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqfp\" (UniqueName: \"kubernetes.io/projected/28b1972b-42aa-4470-8be6-240b219e5975-kube-api-access-mjqfp\") pod \"nova-operator-controller-manager-697bc559fc-526ng\" (UID: \"28b1972b-42aa-4470-8be6-240b219e5975\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.914490 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wfdc\" (UniqueName: \"kubernetes.io/projected/d18e5b1e-653d-4c0e-928f-a2d60b2af855-kube-api-access-6wfdc\") pod \"octavia-operator-controller-manager-998648c74-rmd7h\" (UID: \"d18e5b1e-653d-4c0e-928f-a2d60b2af855\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.914539 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwbgc\" (UniqueName: \"kubernetes.io/projected/b0dc8ce5-ac38-4bac-8026-5ca446e16340-kube-api-access-dwbgc\") pod \"ironic-operator-controller-manager-6c548fd776-5t4bj\" (UID: \"b0dc8ce5-ac38-4bac-8026-5ca446e16340\") " 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.919534 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-w7m92" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.921559 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-526ng"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.927308 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.929255 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.933053 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.933953 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-k9lml" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.934395 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.934685 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.934999 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.937890 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-nx74z" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.944367 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.947243 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.958045 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.964525 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwbgc\" (UniqueName: \"kubernetes.io/projected/b0dc8ce5-ac38-4bac-8026-5ca446e16340-kube-api-access-dwbgc\") pod \"ironic-operator-controller-manager-6c548fd776-5t4bj\" (UID: \"b0dc8ce5-ac38-4bac-8026-5ca446e16340\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.964584 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.971227 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd"] Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.974665 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" Dec 03 22:23:26 crc kubenswrapper[4830]: I1203 22:23:26.995562 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bl9jv" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.015138 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.015768 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tps8v\" (UniqueName: \"kubernetes.io/projected/908c7892-9ff8-4d17-86ea-2daf891ea90b-kube-api-access-tps8v\") pod \"ovn-operator-controller-manager-b6456fdb6-vwd82\" (UID: \"908c7892-9ff8-4d17-86ea-2daf891ea90b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.015795 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hlzn\" (UniqueName: \"kubernetes.io/projected/e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae-kube-api-access-6hlzn\") pod \"keystone-operator-controller-manager-7765d96ddf-4fshq\" (UID: \"e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.015822 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9hzd\" (UniqueName: \"kubernetes.io/projected/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-kube-api-access-d9hzd\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 
22:23:27.015848 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9rd5\" (UniqueName: \"kubernetes.io/projected/90ea4083-18d1-4ace-bcc6-81489c41f117-kube-api-access-v9rd5\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-v2gwm\" (UID: \"90ea4083-18d1-4ace-bcc6-81489c41f117\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.015866 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sql84\" (UniqueName: \"kubernetes.io/projected/01175cc5-e6fa-4e26-b76b-6b7e2a71d51a-kube-api-access-sql84\") pod \"manila-operator-controller-manager-7c79b5df47-8wlrd\" (UID: \"01175cc5-e6fa-4e26-b76b-6b7e2a71d51a\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.015886 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqfp\" (UniqueName: \"kubernetes.io/projected/28b1972b-42aa-4470-8be6-240b219e5975-kube-api-access-mjqfp\") pod \"nova-operator-controller-manager-697bc559fc-526ng\" (UID: \"28b1972b-42aa-4470-8be6-240b219e5975\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.015906 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wfdc\" (UniqueName: \"kubernetes.io/projected/d18e5b1e-653d-4c0e-928f-a2d60b2af855-kube-api-access-6wfdc\") pod \"octavia-operator-controller-manager-998648c74-rmd7h\" (UID: \"d18e5b1e-653d-4c0e-928f-a2d60b2af855\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.015928 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hlmz\" (UniqueName: 
\"kubernetes.io/projected/55d8936f-55fc-4a92-b8a2-c393b6b46eeb-kube-api-access-4hlmz\") pod \"placement-operator-controller-manager-78f8948974-rs6wd\" (UID: \"55d8936f-55fc-4a92-b8a2-c393b6b46eeb\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.015961 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9cck\" (UniqueName: \"kubernetes.io/projected/325f811a-891b-48ae-bde4-a72e7580c925-kube-api-access-l9cck\") pod \"mariadb-operator-controller-manager-56bbcc9d85-48cc9\" (UID: \"325f811a-891b-48ae-bde4-a72e7580c925\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.015985 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.054349 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wfdc\" (UniqueName: \"kubernetes.io/projected/d18e5b1e-653d-4c0e-928f-a2d60b2af855-kube-api-access-6wfdc\") pod \"octavia-operator-controller-manager-998648c74-rmd7h\" (UID: \"d18e5b1e-653d-4c0e-928f-a2d60b2af855\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.059242 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9cck\" (UniqueName: \"kubernetes.io/projected/325f811a-891b-48ae-bde4-a72e7580c925-kube-api-access-l9cck\") pod \"mariadb-operator-controller-manager-56bbcc9d85-48cc9\" (UID: 
\"325f811a-891b-48ae-bde4-a72e7580c925\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.059440 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hlzn\" (UniqueName: \"kubernetes.io/projected/e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae-kube-api-access-6hlzn\") pod \"keystone-operator-controller-manager-7765d96ddf-4fshq\" (UID: \"e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.064185 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sql84\" (UniqueName: \"kubernetes.io/projected/01175cc5-e6fa-4e26-b76b-6b7e2a71d51a-kube-api-access-sql84\") pod \"manila-operator-controller-manager-7c79b5df47-8wlrd\" (UID: \"01175cc5-e6fa-4e26-b76b-6b7e2a71d51a\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.065134 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.066275 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.069095 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-cmgwx" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.070969 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqfp\" (UniqueName: \"kubernetes.io/projected/28b1972b-42aa-4470-8be6-240b219e5975-kube-api-access-mjqfp\") pod \"nova-operator-controller-manager-697bc559fc-526ng\" (UID: \"28b1972b-42aa-4470-8be6-240b219e5975\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.080751 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.082843 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9rd5\" (UniqueName: \"kubernetes.io/projected/90ea4083-18d1-4ace-bcc6-81489c41f117-kube-api-access-v9rd5\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-v2gwm\" (UID: \"90ea4083-18d1-4ace-bcc6-81489c41f117\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.128910 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.130301 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.133243 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-lfx7r" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.137752 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.137828 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tps8v\" (UniqueName: \"kubernetes.io/projected/908c7892-9ff8-4d17-86ea-2daf891ea90b-kube-api-access-tps8v\") pod \"ovn-operator-controller-manager-b6456fdb6-vwd82\" (UID: \"908c7892-9ff8-4d17-86ea-2daf891ea90b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.137880 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9hzd\" (UniqueName: \"kubernetes.io/projected/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-kube-api-access-d9hzd\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.137926 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfbtg\" (UniqueName: \"kubernetes.io/projected/84a09470-19ba-4bef-b0de-1fa4df1561ae-kube-api-access-mfbtg\") pod 
\"swift-operator-controller-manager-5f8c65bbfc-sqmr8\" (UID: \"84a09470-19ba-4bef-b0de-1fa4df1561ae\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.137976 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hlmz\" (UniqueName: \"kubernetes.io/projected/55d8936f-55fc-4a92-b8a2-c393b6b46eeb-kube-api-access-4hlmz\") pod \"placement-operator-controller-manager-78f8948974-rs6wd\" (UID: \"55d8936f-55fc-4a92-b8a2-c393b6b46eeb\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.138243 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.138294 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert podName:7d1112c1-ffce-45e1-94a4-3aad2ae50fe5 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:27.638280018 +0000 UTC m=+1096.634741367 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" (UID: "7d1112c1-ffce-45e1-94a4-3aad2ae50fe5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.144636 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.145019 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.174186 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.175912 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.181181 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vq42m" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.188911 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tps8v\" (UniqueName: \"kubernetes.io/projected/908c7892-9ff8-4d17-86ea-2daf891ea90b-kube-api-access-tps8v\") pod \"ovn-operator-controller-manager-b6456fdb6-vwd82\" (UID: \"908c7892-9ff8-4d17-86ea-2daf891ea90b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.189796 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9hzd\" (UniqueName: \"kubernetes.io/projected/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-kube-api-access-d9hzd\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.189927 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hlmz\" (UniqueName: \"kubernetes.io/projected/55d8936f-55fc-4a92-b8a2-c393b6b46eeb-kube-api-access-4hlmz\") pod \"placement-operator-controller-manager-78f8948974-rs6wd\" (UID: 
\"55d8936f-55fc-4a92-b8a2-c393b6b46eeb\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.196708 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.209092 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.239588 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfbtg\" (UniqueName: \"kubernetes.io/projected/84a09470-19ba-4bef-b0de-1fa4df1561ae-kube-api-access-mfbtg\") pod \"swift-operator-controller-manager-5f8c65bbfc-sqmr8\" (UID: \"84a09470-19ba-4bef-b0de-1fa4df1561ae\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.239958 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.255347 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.261358 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfbtg\" (UniqueName: \"kubernetes.io/projected/84a09470-19ba-4bef-b0de-1fa4df1561ae-kube-api-access-mfbtg\") pod \"swift-operator-controller-manager-5f8c65bbfc-sqmr8\" (UID: \"84a09470-19ba-4bef-b0de-1fa4df1561ae\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.265650 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.286386 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.291082 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.293790 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.296592 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-wkmwr" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.300479 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.319787 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.321828 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.323587 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.330286 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.330802 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.330981 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.337802 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tw786" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.337968 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.338876 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.344918 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5pc8\" (UniqueName: \"kubernetes.io/projected/0c670280-553c-4251-ac28-04fdd66313a7-kube-api-access-f5pc8\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.344991 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8pj6\" (UniqueName: \"kubernetes.io/projected/dec06c39-2a96-4fc6-a2e2-ad865fc394d9-kube-api-access-d8pj6\") pod \"watcher-operator-controller-manager-769dc69bc-dtdgr\" (UID: \"dec06c39-2a96-4fc6-a2e2-ad865fc394d9\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.345033 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.345092 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbn8f\" (UniqueName: \"kubernetes.io/projected/d5a49c34-e03d-49b5-a5a8-507af8ce99be-kube-api-access-kbn8f\") pod \"test-operator-controller-manager-5854674fcc-7bhdq\" (UID: \"d5a49c34-e03d-49b5-a5a8-507af8ce99be\") " 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.345119 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert\") pod \"infra-operator-controller-manager-57548d458d-pd68x\" (UID: \"4cf3851e-6624-48c2-aa71-e799e6b6b685\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.345150 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.345173 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49zwq\" (UniqueName: \"kubernetes.io/projected/c0f0376d-c348-4b7b-b4e1-f8717ea05299-kube-api-access-49zwq\") pod \"telemetry-operator-controller-manager-59779d887b-2cqbq\" (UID: \"c0f0376d-c348-4b7b-b4e1-f8717ea05299\") " pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.345204 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz647\" (UniqueName: \"kubernetes.io/projected/27ed445d-9111-479a-8dc5-5808e0af45be-kube-api-access-fz647\") pod \"rabbitmq-cluster-operator-manager-668c99d594-h5tll\" (UID: \"27ed445d-9111-479a-8dc5-5808e0af45be\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll" Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.345395 4830 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.345459 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert podName:4cf3851e-6624-48c2-aa71-e799e6b6b685 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:28.345438844 +0000 UTC m=+1097.341900193 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert") pod "infra-operator-controller-manager-57548d458d-pd68x" (UID: "4cf3851e-6624-48c2-aa71-e799e6b6b685") : secret "infra-operator-webhook-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.346170 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-pfpfr" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.372131 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.392523 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.411038 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.435715 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.446808 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbn8f\" (UniqueName: \"kubernetes.io/projected/d5a49c34-e03d-49b5-a5a8-507af8ce99be-kube-api-access-kbn8f\") pod \"test-operator-controller-manager-5854674fcc-7bhdq\" (UID: \"d5a49c34-e03d-49b5-a5a8-507af8ce99be\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.446883 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.446911 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49zwq\" (UniqueName: \"kubernetes.io/projected/c0f0376d-c348-4b7b-b4e1-f8717ea05299-kube-api-access-49zwq\") pod \"telemetry-operator-controller-manager-59779d887b-2cqbq\" (UID: \"c0f0376d-c348-4b7b-b4e1-f8717ea05299\") " pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.446949 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz647\" (UniqueName: \"kubernetes.io/projected/27ed445d-9111-479a-8dc5-5808e0af45be-kube-api-access-fz647\") pod \"rabbitmq-cluster-operator-manager-668c99d594-h5tll\" (UID: \"27ed445d-9111-479a-8dc5-5808e0af45be\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.447009 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5pc8\" (UniqueName: \"kubernetes.io/projected/0c670280-553c-4251-ac28-04fdd66313a7-kube-api-access-f5pc8\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.447056 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8pj6\" (UniqueName: \"kubernetes.io/projected/dec06c39-2a96-4fc6-a2e2-ad865fc394d9-kube-api-access-d8pj6\") pod \"watcher-operator-controller-manager-769dc69bc-dtdgr\" (UID: \"dec06c39-2a96-4fc6-a2e2-ad865fc394d9\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.447088 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.448581 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.448637 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:27.948621311 +0000 UTC m=+1096.945082660 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "metrics-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.447246 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.449438 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:27.949426883 +0000 UTC m=+1096.945888232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "webhook-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.490486 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz647\" (UniqueName: \"kubernetes.io/projected/27ed445d-9111-479a-8dc5-5808e0af45be-kube-api-access-fz647\") pod \"rabbitmq-cluster-operator-manager-668c99d594-h5tll\" (UID: \"27ed445d-9111-479a-8dc5-5808e0af45be\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.491454 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbn8f\" (UniqueName: \"kubernetes.io/projected/d5a49c34-e03d-49b5-a5a8-507af8ce99be-kube-api-access-kbn8f\") pod \"test-operator-controller-manager-5854674fcc-7bhdq\" (UID: \"d5a49c34-e03d-49b5-a5a8-507af8ce99be\") " 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.498253 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8pj6\" (UniqueName: \"kubernetes.io/projected/dec06c39-2a96-4fc6-a2e2-ad865fc394d9-kube-api-access-d8pj6\") pod \"watcher-operator-controller-manager-769dc69bc-dtdgr\" (UID: \"dec06c39-2a96-4fc6-a2e2-ad865fc394d9\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.515622 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5pc8\" (UniqueName: \"kubernetes.io/projected/0c670280-553c-4251-ac28-04fdd66313a7-kube-api-access-f5pc8\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.545686 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49zwq\" (UniqueName: \"kubernetes.io/projected/c0f0376d-c348-4b7b-b4e1-f8717ea05299-kube-api-access-49zwq\") pod \"telemetry-operator-controller-manager-59779d887b-2cqbq\" (UID: \"c0f0376d-c348-4b7b-b4e1-f8717ea05299\") " pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.546385 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.651685 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.651838 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.651883 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert podName:7d1112c1-ffce-45e1-94a4-3aad2ae50fe5 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:28.651869881 +0000 UTC m=+1097.648331220 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" (UID: "7d1112c1-ffce-45e1-94a4-3aad2ae50fe5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.739331 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.777929 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.785728 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24"] Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.785737 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.955671 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:27 crc kubenswrapper[4830]: I1203 22:23:27.955737 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.955875 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.955921 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:28.955907002 +0000 UTC m=+1097.952368351 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "metrics-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.955986 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 22:23:27 crc kubenswrapper[4830]: E1203 22:23:27.956055 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:28.956038856 +0000 UTC m=+1097.952500195 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "webhook-server-cert" not found Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.310932 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24" event={"ID":"cca79891-68e7-4827-8da4-c0570dbca762","Type":"ContainerStarted","Data":"6fceea2bbc14ed4a851b097c00e8524161fd1b8860b095a8525d8ce54950dbbf"} Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.317438 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k"] Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.351401 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr"] Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.365197 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert\") pod \"infra-operator-controller-manager-57548d458d-pd68x\" (UID: \"4cf3851e-6624-48c2-aa71-e799e6b6b685\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.366391 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.366452 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert podName:4cf3851e-6624-48c2-aa71-e799e6b6b685 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:30.366434002 +0000 UTC m=+1099.362895351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert") pod "infra-operator-controller-manager-57548d458d-pd68x" (UID: "4cf3851e-6624-48c2-aa71-e799e6b6b685") : secret "infra-operator-webhook-server-cert" not found Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.372857 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz"] Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.597758 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8"] Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.608370 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq"] Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.623632 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd"] Dec 03 
22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.629889 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82"] Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.654539 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h"] Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.673265 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm"] Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.683352 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.683724 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.683786 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert podName:7d1112c1-ffce-45e1-94a4-3aad2ae50fe5 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:30.683765694 +0000 UTC m=+1099.680227043 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" (UID: "7d1112c1-ffce-45e1-94a4-3aad2ae50fe5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.690284 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd"] Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.702134 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-526ng"] Dec 03 22:23:28 crc kubenswrapper[4830]: W1203 22:23:28.713615 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28b1972b_42aa_4470_8be6_240b219e5975.slice/crio-86a1efcb424302c70b0d280cf0180a758727a74f7bcdaebff3820b1556daedfa WatchSource:0}: Error finding container 86a1efcb424302c70b0d280cf0180a758727a74f7bcdaebff3820b1556daedfa: Status 404 returned error can't find the container with id 86a1efcb424302c70b0d280cf0180a758727a74f7bcdaebff3820b1556daedfa Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.721753 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k"] Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.726492 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg"] Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.733286 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mjqfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-526ng_openstack-operators(28b1972b-42aa-4470-8be6-240b219e5975): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.733348 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-84c5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-nggxg_openstack-operators(5e5620ec-6ef3-47fc-b88b-06a2f2849b48): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.733799 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fz647,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-h5tll_openstack-operators(27ed445d-9111-479a-8dc5-5808e0af45be): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.734952 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll" podUID="27ed445d-9111-479a-8dc5-5808e0af45be" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.736803 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mjqfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
nova-operator-controller-manager-697bc559fc-526ng_openstack-operators(28b1972b-42aa-4470-8be6-240b219e5975): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.736828 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll"] Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.736881 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-84c5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
heat-operator-controller-manager-5f64f6f8bb-nggxg_openstack-operators(5e5620ec-6ef3-47fc-b88b-06a2f2849b48): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.738558 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" podUID="5e5620ec-6ef3-47fc-b88b-06a2f2849b48" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.738617 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" podUID="28b1972b-42aa-4470-8be6-240b219e5975" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.740161 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d8pj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-dtdgr_openstack-operators(dec06c39-2a96-4fc6-a2e2-ad865fc394d9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.742105 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d8pj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-dtdgr_openstack-operators(dec06c39-2a96-4fc6-a2e2-ad865fc394d9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.743597 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" podUID="dec06c39-2a96-4fc6-a2e2-ad865fc394d9" Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.743965 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj"] Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.747034 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l9cck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-48cc9_openstack-operators(325f811a-891b-48ae-bde4-a72e7580c925): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.747577 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.41:5001/openstack-k8s-operators/telemetry-operator:8eb50ce62a3905aa1a1da4c6aeb639c250edca21,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-49zwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-59779d887b-2cqbq_openstack-operators(c0f0376d-c348-4b7b-b4e1-f8717ea05299): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.750246 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9"] Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.750848 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-49zwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-59779d887b-2cqbq_openstack-operators(c0f0376d-c348-4b7b-b4e1-f8717ea05299): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.750930 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l9cck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-48cc9_openstack-operators(325f811a-891b-48ae-bde4-a72e7580c925): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.752109 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" podUID="325f811a-891b-48ae-bde4-a72e7580c925" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.752185 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" podUID="c0f0376d-c348-4b7b-b4e1-f8717ea05299" Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.755967 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr"] Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.760604 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kbn8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-7bhdq_openstack-operators(d5a49c34-e03d-49b5-a5a8-507af8ce99be): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.763193 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq"] Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.769262 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kbn8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-7bhdq_openstack-operators(d5a49c34-e03d-49b5-a5a8-507af8ce99be): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.772076 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" podUID="d5a49c34-e03d-49b5-a5a8-507af8ce99be" Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.774857 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq"] Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.988167 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:28 crc kubenswrapper[4830]: I1203 22:23:28.988248 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.988380 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.988375 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.988431 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:30.988417593 +0000 UTC m=+1099.984878942 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "metrics-server-cert" not found Dec 03 22:23:28 crc kubenswrapper[4830]: E1203 22:23:28.988466 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:30.988441973 +0000 UTC m=+1099.984903392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "webhook-server-cert" not found Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.321361 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj" event={"ID":"b0dc8ce5-ac38-4bac-8026-5ca446e16340","Type":"ContainerStarted","Data":"fd398af918372a12f98e3e8c9c60e7ea1edfe9c14c3ad2d50bc2c66e6581e881"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.323437 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" event={"ID":"55d8936f-55fc-4a92-b8a2-c393b6b46eeb","Type":"ContainerStarted","Data":"9aa37e53918f898b87299f49a1b0af1ef610506fcf5bf019f7bff9e1595efee9"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.324762 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr" 
event={"ID":"0ecd210b-fb48-42fe-b161-6583d913b6f8","Type":"ContainerStarted","Data":"99d7d7071d9dac9e9b226d149e8497d50f3fd90212921ee93e239706c638e090"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.327056 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" event={"ID":"d5a49c34-e03d-49b5-a5a8-507af8ce99be","Type":"ContainerStarted","Data":"b45d40ef86e7f70b019dfde5199592031535342433c38c498536c0772f0ea258"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.329093 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz" event={"ID":"22f7d8a7-b9bf-40ca-aca3-13a370558f38","Type":"ContainerStarted","Data":"f5aec8aac1588c514426b4a5d7985f8b9461901833f85b99ec4aff8d1d736fa4"} Dec 03 22:23:29 crc kubenswrapper[4830]: E1203 22:23:29.329388 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" podUID="d5a49c34-e03d-49b5-a5a8-507af8ce99be" Dec 03 22:23:29 crc kubenswrapper[4830]: E1203 22:23:29.340375 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" podUID="325f811a-891b-48ae-bde4-a72e7580c925" Dec 03 22:23:29 crc kubenswrapper[4830]: E1203 22:23:29.344781 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.41:5001/openstack-k8s-operators/telemetry-operator:8eb50ce62a3905aa1a1da4c6aeb639c250edca21\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" podUID="c0f0376d-c348-4b7b-b4e1-f8717ea05299" Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.368143 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" event={"ID":"325f811a-891b-48ae-bde4-a72e7580c925","Type":"ContainerStarted","Data":"6952335ffea5084f9c6179a6dee145ccd3aca39905238b3d216752e610de23ed"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.368185 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" event={"ID":"e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae","Type":"ContainerStarted","Data":"25c3aed212c9910c0bc3562033da0cb5e1448d20078656148c5cb8a42f042756"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.368197 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" event={"ID":"c0f0376d-c348-4b7b-b4e1-f8717ea05299","Type":"ContainerStarted","Data":"b8f4edcbb1a9321b8dd2991b696cc55d838bd2dd1e6176c671c48cd166cad3f2"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.368207 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82" event={"ID":"908c7892-9ff8-4d17-86ea-2daf891ea90b","Type":"ContainerStarted","Data":"7b8ea8ac579dacc0b00d11b5302b6ee6cd1855f4d211084fd6b3b02522826f97"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.368216 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k" event={"ID":"e8bc8bcd-fde2-43fd-86ae-814182f2f5ac","Type":"ContainerStarted","Data":"38150d9060e16bc002285d36d7ceb3d3d7d4b795036e491bf140cb7b6d90b313"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.368228 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8" event={"ID":"84a09470-19ba-4bef-b0de-1fa4df1561ae","Type":"ContainerStarted","Data":"cfb98f5afae9ded38c6f5a82b8d57c5b0630fff0d76613fa4f7c7cb450abc1ef"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.373613 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm" event={"ID":"90ea4083-18d1-4ace-bcc6-81489c41f117","Type":"ContainerStarted","Data":"d60ddf9d55efbb0761aeb273f41b39bc70df24e9d5990c74030e370d9bd42853"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.375573 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" event={"ID":"5e5620ec-6ef3-47fc-b88b-06a2f2849b48","Type":"ContainerStarted","Data":"690b9fd33d10495b7ea230f357362bf8e78483530c86b090519bbfeecc184396"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.381250 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll" event={"ID":"27ed445d-9111-479a-8dc5-5808e0af45be","Type":"ContainerStarted","Data":"2b74dfbcacc360cdbca3543d8e437ebd0cdcc1fbb1e9f42b9bdd23d70aa7127c"} Dec 03 22:23:29 crc 
kubenswrapper[4830]: E1203 22:23:29.381541 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" podUID="5e5620ec-6ef3-47fc-b88b-06a2f2849b48" Dec 03 22:23:29 crc kubenswrapper[4830]: E1203 22:23:29.383356 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll" podUID="27ed445d-9111-479a-8dc5-5808e0af45be" Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.383473 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" event={"ID":"28b1972b-42aa-4470-8be6-240b219e5975","Type":"ContainerStarted","Data":"86a1efcb424302c70b0d280cf0180a758727a74f7bcdaebff3820b1556daedfa"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.390237 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h" event={"ID":"d18e5b1e-653d-4c0e-928f-a2d60b2af855","Type":"ContainerStarted","Data":"06d52c809083009092a56a682bb76070f3e0c578006a56d39f5ddf7e18c602fc"} Dec 03 22:23:29 crc kubenswrapper[4830]: E1203 22:23:29.390378 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" podUID="28b1972b-42aa-4470-8be6-240b219e5975" Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.399851 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k" event={"ID":"a5f4a0b7-d118-45b5-ab87-9f03413d4671","Type":"ContainerStarted","Data":"b0e3bf7cf08b066481f7b0def64dcbbd0ade21337f13a9f064794fa36f2135b2"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.402577 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" event={"ID":"01175cc5-e6fa-4e26-b76b-6b7e2a71d51a","Type":"ContainerStarted","Data":"cdffd849880f165cdb045a600dc9298db9565fe9dbf25a17e5ba2c4c9736d14a"} Dec 03 22:23:29 crc kubenswrapper[4830]: I1203 22:23:29.403783 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" event={"ID":"dec06c39-2a96-4fc6-a2e2-ad865fc394d9","Type":"ContainerStarted","Data":"b9ef1966bbebdd7d14730f6a3a8b3e9aee7e0e97060b513ab8f51c626ac8b3ac"} Dec 03 22:23:29 crc kubenswrapper[4830]: E1203 22:23:29.408595 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" podUID="dec06c39-2a96-4fc6-a2e2-ad865fc394d9" Dec 03 22:23:30 crc kubenswrapper[4830]: I1203 22:23:30.412876 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert\") pod \"infra-operator-controller-manager-57548d458d-pd68x\" (UID: \"4cf3851e-6624-48c2-aa71-e799e6b6b685\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:30 crc kubenswrapper[4830]: E1203 22:23:30.413006 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 22:23:30 crc kubenswrapper[4830]: E1203 22:23:30.413041 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert podName:4cf3851e-6624-48c2-aa71-e799e6b6b685 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:34.41302875 +0000 UTC m=+1103.409490099 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert") pod "infra-operator-controller-manager-57548d458d-pd68x" (UID: "4cf3851e-6624-48c2-aa71-e799e6b6b685") : secret "infra-operator-webhook-server-cert" not found Dec 03 22:23:30 crc kubenswrapper[4830]: E1203 22:23:30.415821 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll" podUID="27ed445d-9111-479a-8dc5-5808e0af45be" Dec 03 22:23:30 crc kubenswrapper[4830]: E1203 22:23:30.415967 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" podUID="28b1972b-42aa-4470-8be6-240b219e5975" Dec 03 22:23:30 crc kubenswrapper[4830]: E1203 22:23:30.416087 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" podUID="325f811a-891b-48ae-bde4-a72e7580c925" Dec 03 22:23:30 crc kubenswrapper[4830]: E1203 22:23:30.416135 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" podUID="dec06c39-2a96-4fc6-a2e2-ad865fc394d9" Dec 03 22:23:30 crc kubenswrapper[4830]: E1203 22:23:30.416960 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" podUID="5e5620ec-6ef3-47fc-b88b-06a2f2849b48" Dec 03 22:23:30 crc kubenswrapper[4830]: E1203 22:23:30.417069 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.41:5001/openstack-k8s-operators/telemetry-operator:8eb50ce62a3905aa1a1da4c6aeb639c250edca21\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" 
podUID="c0f0376d-c348-4b7b-b4e1-f8717ea05299" Dec 03 22:23:30 crc kubenswrapper[4830]: E1203 22:23:30.417620 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" podUID="d5a49c34-e03d-49b5-a5a8-507af8ce99be" Dec 03 22:23:30 crc kubenswrapper[4830]: I1203 22:23:30.718318 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:30 crc kubenswrapper[4830]: E1203 22:23:30.718484 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:30 crc kubenswrapper[4830]: E1203 22:23:30.718547 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert podName:7d1112c1-ffce-45e1-94a4-3aad2ae50fe5 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:34.718531791 +0000 UTC m=+1103.714993140 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" (UID: "7d1112c1-ffce-45e1-94a4-3aad2ae50fe5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:31 crc kubenswrapper[4830]: I1203 22:23:31.021575 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:31 crc kubenswrapper[4830]: I1203 22:23:31.021669 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:31 crc kubenswrapper[4830]: E1203 22:23:31.021718 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 22:23:31 crc kubenswrapper[4830]: E1203 22:23:31.021753 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 22:23:31 crc kubenswrapper[4830]: E1203 22:23:31.021771 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:35.021755871 +0000 UTC m=+1104.018217220 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "metrics-server-cert" not found Dec 03 22:23:31 crc kubenswrapper[4830]: E1203 22:23:31.021785 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:35.021779172 +0000 UTC m=+1104.018240521 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "webhook-server-cert" not found Dec 03 22:23:31 crc kubenswrapper[4830]: E1203 22:23:31.425032 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.41:5001/openstack-k8s-operators/telemetry-operator:8eb50ce62a3905aa1a1da4c6aeb639c250edca21\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" podUID="c0f0376d-c348-4b7b-b4e1-f8717ea05299" Dec 03 22:23:34 crc kubenswrapper[4830]: I1203 22:23:34.468615 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert\") pod \"infra-operator-controller-manager-57548d458d-pd68x\" (UID: \"4cf3851e-6624-48c2-aa71-e799e6b6b685\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:34 crc kubenswrapper[4830]: E1203 22:23:34.468760 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 22:23:34 crc kubenswrapper[4830]: E1203 22:23:34.468947 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert podName:4cf3851e-6624-48c2-aa71-e799e6b6b685 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:42.468928242 +0000 UTC m=+1111.465389601 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert") pod "infra-operator-controller-manager-57548d458d-pd68x" (UID: "4cf3851e-6624-48c2-aa71-e799e6b6b685") : secret "infra-operator-webhook-server-cert" not found Dec 03 22:23:34 crc kubenswrapper[4830]: I1203 22:23:34.773960 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:34 crc kubenswrapper[4830]: E1203 22:23:34.774149 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:34 crc kubenswrapper[4830]: E1203 22:23:34.774462 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert podName:7d1112c1-ffce-45e1-94a4-3aad2ae50fe5 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:42.774438334 +0000 UTC m=+1111.770899713 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" (UID: "7d1112c1-ffce-45e1-94a4-3aad2ae50fe5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:35 crc kubenswrapper[4830]: I1203 22:23:35.078868 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:35 crc kubenswrapper[4830]: I1203 22:23:35.078948 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:35 crc kubenswrapper[4830]: E1203 22:23:35.079030 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 22:23:35 crc kubenswrapper[4830]: E1203 22:23:35.079047 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 22:23:35 crc kubenswrapper[4830]: E1203 22:23:35.079081 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:43.079065872 +0000 UTC m=+1112.075527221 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "webhook-server-cert" not found Dec 03 22:23:35 crc kubenswrapper[4830]: E1203 22:23:35.079096 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:43.079090593 +0000 UTC m=+1112.075551942 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "metrics-server-cert" not found Dec 03 22:23:40 crc kubenswrapper[4830]: E1203 22:23:40.454974 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 03 22:23:40 crc kubenswrapper[4830]: E1203 22:23:40.455434 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4hlmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-rs6wd_openstack-operators(55d8936f-55fc-4a92-b8a2-c393b6b46eeb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:23:41 crc kubenswrapper[4830]: E1203 22:23:41.101525 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9" Dec 03 22:23:41 crc kubenswrapper[4830]: E1203 22:23:41.101725 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sql84,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-8wlrd_openstack-operators(01175cc5-e6fa-4e26-b76b-6b7e2a71d51a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:23:41 crc kubenswrapper[4830]: E1203 22:23:41.777118 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 03 22:23:41 crc kubenswrapper[4830]: E1203 22:23:41.777545 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6hlzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-4fshq_openstack-operators(e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.495148 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert\") pod \"infra-operator-controller-manager-57548d458d-pd68x\" (UID: \"4cf3851e-6624-48c2-aa71-e799e6b6b685\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.505112 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cf3851e-6624-48c2-aa71-e799e6b6b685-cert\") pod \"infra-operator-controller-manager-57548d458d-pd68x\" (UID: \"4cf3851e-6624-48c2-aa71-e799e6b6b685\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.519247 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h" event={"ID":"d18e5b1e-653d-4c0e-928f-a2d60b2af855","Type":"ContainerStarted","Data":"a3cfc9189d539994702598035d63f1321cce682fe485558cf4f81bb502db448a"} Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.542673 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24" event={"ID":"cca79891-68e7-4827-8da4-c0570dbca762","Type":"ContainerStarted","Data":"0f0e22edc70d1ba58c3553fc9ae2d8560c0921c85e8f3233ee5276f5ce699ed6"} Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.581643 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8" event={"ID":"84a09470-19ba-4bef-b0de-1fa4df1561ae","Type":"ContainerStarted","Data":"cc05e11395333c018e2d9ea2af4289b3a9934f33b6edbecc1b044d94ba199e59"} Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.591868 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.592652 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82" event={"ID":"908c7892-9ff8-4d17-86ea-2daf891ea90b","Type":"ContainerStarted","Data":"6ef1fddff80a884a49fe993c000dd5340b3ab178a278e90b378b56f337dec4e5"} Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.617973 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm" event={"ID":"90ea4083-18d1-4ace-bcc6-81489c41f117","Type":"ContainerStarted","Data":"aeb7aa7503a83460e4e535f82b576e512715954840d89480ed47e1f02fa19b37"} Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.635125 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj" event={"ID":"b0dc8ce5-ac38-4bac-8026-5ca446e16340","Type":"ContainerStarted","Data":"043164e58ac4b699eef7de2aae315700871be7e344e8c4dddf8c523fcaac8a62"} Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.639744 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr" event={"ID":"0ecd210b-fb48-42fe-b161-6583d913b6f8","Type":"ContainerStarted","Data":"9718829326bac93dccd5cb88e37c65bdaca40c056a28b77b9fc107bd7b6006bc"} Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.649924 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k" event={"ID":"e8bc8bcd-fde2-43fd-86ae-814182f2f5ac","Type":"ContainerStarted","Data":"34bb06be9c10b74ef644aead8c7c929a4177de5782f3d35949c5f20d6a85c918"} Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.659774 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz" event={"ID":"22f7d8a7-b9bf-40ca-aca3-13a370558f38","Type":"ContainerStarted","Data":"6814bbf1a6994b09b9289d36fc6e1b1240a565c19e4beb1d554d33277fb3dff8"} Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.668709 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k" event={"ID":"a5f4a0b7-d118-45b5-ab87-9f03413d4671","Type":"ContainerStarted","Data":"1bf65db9f3b93ca946a7f918088c0b55ccbfb69ef27c11cf42e84db6dc36ce9c"} Dec 03 22:23:42 crc kubenswrapper[4830]: I1203 22:23:42.802312 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:42 crc kubenswrapper[4830]: E1203 22:23:42.802485 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:42 crc kubenswrapper[4830]: E1203 22:23:42.802544 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert podName:7d1112c1-ffce-45e1-94a4-3aad2ae50fe5 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:58.802530423 +0000 UTC m=+1127.798991772 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" (UID: "7d1112c1-ffce-45e1-94a4-3aad2ae50fe5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:23:43 crc kubenswrapper[4830]: I1203 22:23:43.109152 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:43 crc kubenswrapper[4830]: I1203 22:23:43.109222 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:43 crc kubenswrapper[4830]: E1203 22:23:43.109390 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 22:23:43 crc kubenswrapper[4830]: E1203 22:23:43.109434 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:59.109419923 +0000 UTC m=+1128.105881272 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "metrics-server-cert" not found Dec 03 22:23:43 crc kubenswrapper[4830]: E1203 22:23:43.109805 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 22:23:43 crc kubenswrapper[4830]: E1203 22:23:43.109830 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs podName:0c670280-553c-4251-ac28-04fdd66313a7 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:59.109823194 +0000 UTC m=+1128.106284543 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs") pod "openstack-operator-controller-manager-6bb8cf96cb-6vrpp" (UID: "0c670280-553c-4251-ac28-04fdd66313a7") : secret "webhook-server-cert" not found Dec 03 22:23:43 crc kubenswrapper[4830]: I1203 22:23:43.114879 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-pd68x"] Dec 03 22:23:43 crc kubenswrapper[4830]: I1203 22:23:43.676722 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" event={"ID":"4cf3851e-6624-48c2-aa71-e799e6b6b685","Type":"ContainerStarted","Data":"430a7a1e0e2c8dabdf5587531ec0b7a8a5e3ff3cd19f7144d941e9e45dcb4e6d"} Dec 03 22:23:57 crc kubenswrapper[4830]: E1203 22:23:57.717334 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" podUID="01175cc5-e6fa-4e26-b76b-6b7e2a71d51a" Dec 03 22:23:57 crc kubenswrapper[4830]: E1203 22:23:57.793286 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" podUID="e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.830216 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj" event={"ID":"b0dc8ce5-ac38-4bac-8026-5ca446e16340","Type":"ContainerStarted","Data":"a40d0e70c34267d38b12ffedaabeefe9cec031d9d4cc3a6193b7aaca7e2965cb"} Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.830589 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.835545 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.855222 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h" event={"ID":"d18e5b1e-653d-4c0e-928f-a2d60b2af855","Type":"ContainerStarted","Data":"f3da264bacf77b4f6e3eea77185a27da9e2a701ef7fcee5858ff2ca694519ecb"} Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.855297 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.864950 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" event={"ID":"c0f0376d-c348-4b7b-b4e1-f8717ea05299","Type":"ContainerStarted","Data":"f705e2dd5adca9f9ec318e014c583ba5c7ad8d11807d216b1ca84a770747e2a7"} Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.872799 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.875894 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-5t4bj" podStartSLOduration=3.763967767 podStartE2EDuration="31.875881109s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.7398444 +0000 UTC m=+1097.736305749" lastFinishedPulling="2025-12-03 22:23:56.851757742 +0000 UTC m=+1125.848219091" observedRunningTime="2025-12-03 22:23:57.861889068 +0000 UTC m=+1126.858350417" watchObservedRunningTime="2025-12-03 22:23:57.875881109 +0000 UTC m=+1126.872342458" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.876023 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" event={"ID":"dec06c39-2a96-4fc6-a2e2-ad865fc394d9","Type":"ContainerStarted","Data":"491d5e02b8285387ec06b875e38f05d406f8703f5292f4dd7b98795f7b5e7bde"} Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.890548 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k" event={"ID":"a5f4a0b7-d118-45b5-ab87-9f03413d4671","Type":"ContainerStarted","Data":"aefebecf1164b478deeaa8cdb78a48c7bd661dc136619a21d39469e69f288326"} Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.891388 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.894132 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.900615 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8" event={"ID":"84a09470-19ba-4bef-b0de-1fa4df1561ae","Type":"ContainerStarted","Data":"c7e2e2cb37853c34c1720e0f82e4f1b65e4595746b7afb195d204eceffddcb9f"} Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.901077 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.902788 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.913793 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll" event={"ID":"27ed445d-9111-479a-8dc5-5808e0af45be","Type":"ContainerStarted","Data":"395ec207b5b45a6a55cf8a20207505cec3a709c67f93a1d87d4af43874e3af37"} Dec 03 22:23:57 crc kubenswrapper[4830]: E1203 22:23:57.917286 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" podUID="55d8936f-55fc-4a92-b8a2-c393b6b46eeb" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.921144 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-rmd7h" podStartSLOduration=3.748371793 podStartE2EDuration="31.921124841s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.710245795 +0000 UTC m=+1097.706707144" lastFinishedPulling="2025-12-03 22:23:56.882998843 +0000 UTC m=+1125.879460192" observedRunningTime="2025-12-03 22:23:57.916945187 +0000 UTC m=+1126.913406536" watchObservedRunningTime="2025-12-03 22:23:57.921124841 +0000 UTC m=+1126.917586190" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.960035 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" event={"ID":"01175cc5-e6fa-4e26-b76b-6b7e2a71d51a","Type":"ContainerStarted","Data":"a837d8dbfdcfd03c28508d6bba1cffe465fcb49e02bdf52bb3ead0c0540c9e5e"} Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.968122 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h5tll" podStartSLOduration=3.9292063969999997 podStartE2EDuration="30.968101611s" podCreationTimestamp="2025-12-03 22:23:27 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.733638071 +0000 UTC m=+1097.730099420" lastFinishedPulling="2025-12-03 22:23:55.772533285 +0000 UTC m=+1124.768994634" observedRunningTime="2025-12-03 22:23:57.964603275 +0000 UTC m=+1126.961064624" watchObservedRunningTime="2025-12-03 22:23:57.968101611 +0000 UTC m=+1126.964562950" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.992633 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82" event={"ID":"908c7892-9ff8-4d17-86ea-2daf891ea90b","Type":"ContainerStarted","Data":"c9b78362f059993b08113de960d2b7b233179fca9ae1b0c25933e4761ede84db"} Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.994580 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82" Dec 03 22:23:57 crc kubenswrapper[4830]: I1203 22:23:57.998542 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.006798 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-sqmr8" podStartSLOduration=3.788271402 podStartE2EDuration="32.006779254s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.633439675 +0000 UTC m=+1097.629901024" lastFinishedPulling="2025-12-03 22:23:56.851947517 +0000 UTC m=+1125.848408876" observedRunningTime="2025-12-03 22:23:58.004352798 +0000 UTC m=+1127.000814147" watchObservedRunningTime="2025-12-03 22:23:58.006779254 +0000 UTC m=+1127.003240603" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.010432 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" event={"ID":"5e5620ec-6ef3-47fc-b88b-06a2f2849b48","Type":"ContainerStarted","Data":"5137cb43339f621c6a587d0bd4342591e19be17d602980b144d3f36904f0c54a"} Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.026898 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" event={"ID":"d5a49c34-e03d-49b5-a5a8-507af8ce99be","Type":"ContainerStarted","Data":"d95cb8dfce07cd1b533cd5194a1e169e4a3d9b2662e30cad99cd8cc6adc6810b"} Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.059162 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" 
event={"ID":"4cf3851e-6624-48c2-aa71-e799e6b6b685","Type":"ContainerStarted","Data":"725726bbaa841aa39f1a525d058be1ef411943a2bb0e24f32f6f3e7519f3f312"} Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.066612 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-jj85k" podStartSLOduration=3.9529224640000002 podStartE2EDuration="32.066571233s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.732688066 +0000 UTC m=+1097.729149415" lastFinishedPulling="2025-12-03 22:23:56.846336835 +0000 UTC m=+1125.842798184" observedRunningTime="2025-12-03 22:23:58.060182408 +0000 UTC m=+1127.056643757" watchObservedRunningTime="2025-12-03 22:23:58.066571233 +0000 UTC m=+1127.063032582" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.086703 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" event={"ID":"28b1972b-42aa-4470-8be6-240b219e5975","Type":"ContainerStarted","Data":"db40b24b2d2a1ce80ee41ff992b51df12cf6c437788036def416172822ae3334"} Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.087369 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.093005 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" event={"ID":"325f811a-891b-48ae-bde4-a72e7580c925","Type":"ContainerStarted","Data":"fd3b2a8741e19378c9651c004712e326e34f0c14c788143ee18fd75ab0b7a8d3"} Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.097359 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" 
event={"ID":"e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae","Type":"ContainerStarted","Data":"c588f00727c85410058fb858340e0429d51ee4219e17f38cef0dd0f70d6d4d8a"} Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.102305 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm" event={"ID":"90ea4083-18d1-4ace-bcc6-81489c41f117","Type":"ContainerStarted","Data":"43c7a31d0a8b9f13c35583d62c592a86982a4cae0de548277ebe21a6a64aadd2"} Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.102834 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.104407 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.105819 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz" event={"ID":"22f7d8a7-b9bf-40ca-aca3-13a370558f38","Type":"ContainerStarted","Data":"1bce8792ec22755aaeb4be29f38a6d1aec4a0306ecee704dd606765648742896"} Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.106464 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.108885 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.126404 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" podStartSLOduration=5.181093028 podStartE2EDuration="32.126387612s" 
podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.733182699 +0000 UTC m=+1097.729644048" lastFinishedPulling="2025-12-03 22:23:55.678477293 +0000 UTC m=+1124.674938632" observedRunningTime="2025-12-03 22:23:58.117567832 +0000 UTC m=+1127.114029171" watchObservedRunningTime="2025-12-03 22:23:58.126387612 +0000 UTC m=+1127.122848961" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.201995 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-vwd82" podStartSLOduration=4.012598321 podStartE2EDuration="32.201980351s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.691260499 +0000 UTC m=+1097.687721848" lastFinishedPulling="2025-12-03 22:23:56.880642529 +0000 UTC m=+1125.877103878" observedRunningTime="2025-12-03 22:23:58.196577154 +0000 UTC m=+1127.193038503" watchObservedRunningTime="2025-12-03 22:23:58.201980351 +0000 UTC m=+1127.198441700" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.238692 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-v2gwm" podStartSLOduration=4.076336545 podStartE2EDuration="32.23867386s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.705204268 +0000 UTC m=+1097.701665617" lastFinishedPulling="2025-12-03 22:23:56.867541583 +0000 UTC m=+1125.864002932" observedRunningTime="2025-12-03 22:23:58.23461711 +0000 UTC m=+1127.231078459" watchObservedRunningTime="2025-12-03 22:23:58.23867386 +0000 UTC m=+1127.235135209" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.261582 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-zgtdz" podStartSLOduration=3.800283067 podStartE2EDuration="32.261560924s" 
podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.395091641 +0000 UTC m=+1097.391552990" lastFinishedPulling="2025-12-03 22:23:56.856369498 +0000 UTC m=+1125.852830847" observedRunningTime="2025-12-03 22:23:58.253487125 +0000 UTC m=+1127.249948474" watchObservedRunningTime="2025-12-03 22:23:58.261560924 +0000 UTC m=+1127.258022273" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.863335 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:58 crc kubenswrapper[4830]: I1203 22:23:58.881191 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d1112c1-ffce-45e1-94a4-3aad2ae50fe5-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h\" (UID: \"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.113935 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" event={"ID":"5e5620ec-6ef3-47fc-b88b-06a2f2849b48","Type":"ContainerStarted","Data":"7c95304ddeb9864883754047aea25c118ca95c50ec4b476669ca0507907569c0"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.114009 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.116091 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" event={"ID":"d5a49c34-e03d-49b5-a5a8-507af8ce99be","Type":"ContainerStarted","Data":"342ff212f68fd545bcb45a72ca911ae5508d7f96dc8679033a0c34caf60855f5"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.116482 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.117946 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" event={"ID":"325f811a-891b-48ae-bde4-a72e7580c925","Type":"ContainerStarted","Data":"b45ca9c186ef7038907edaaf76785594bea68ca64ad31a7842faf0eab8897261"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.118329 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.121445 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" event={"ID":"55d8936f-55fc-4a92-b8a2-c393b6b46eeb","Type":"ContainerStarted","Data":"eb73cfe0523b7b9ed9466e01eb6b4b27d05c9933b5b056c20e496f2e47ca468e"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.124494 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" event={"ID":"01175cc5-e6fa-4e26-b76b-6b7e2a71d51a","Type":"ContainerStarted","Data":"868eadbb2787fc032fbc498e82549ba300c11b84846bc105a7527ed24d9e8111"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.124596 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.126376 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" event={"ID":"c0f0376d-c348-4b7b-b4e1-f8717ea05299","Type":"ContainerStarted","Data":"c19e842b6d52f4f6e1836ffb68eb7cc08aef23d1ff21441c434afb19acea4dc6"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.126410 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.130179 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" event={"ID":"dec06c39-2a96-4fc6-a2e2-ad865fc394d9","Type":"ContainerStarted","Data":"a760fb5d53a8d679b35a7fbd06123ed943952353cb1f47a9325fcf1630b2cf0e"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.130287 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.131548 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.132917 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k" event={"ID":"e8bc8bcd-fde2-43fd-86ae-814182f2f5ac","Type":"ContainerStarted","Data":"5fcc54508abc67e4ce8cf4a5c4b06a753ffd9f4ce97a6c34329ec668000495ab"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.133173 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.135747 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" event={"ID":"28b1972b-42aa-4470-8be6-240b219e5975","Type":"ContainerStarted","Data":"bc2cb2c53c6de9709f06a94c549aeab13e9c38c026d4229339b4abcc39d70b81"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.136897 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.136873 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" podStartSLOduration=6.098823858 podStartE2EDuration="33.136854167s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.733250051 +0000 UTC m=+1097.729711390" lastFinishedPulling="2025-12-03 22:23:55.77128035 +0000 UTC m=+1124.767741699" observedRunningTime="2025-12-03 22:23:59.132054906 +0000 UTC m=+1128.128516255" watchObservedRunningTime="2025-12-03 22:23:59.136854167 +0000 UTC m=+1128.133315516" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.137532 4830 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" event={"ID":"e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae","Type":"ContainerStarted","Data":"146efab916c5e30ffe03004ed0b4b125575f3915ce5127cd8bcd68068f5421b2"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.137780 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.139333 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24" event={"ID":"cca79891-68e7-4827-8da4-c0570dbca762","Type":"ContainerStarted","Data":"06c1b41ca0a22b6b4f7655785d37032c43fb28752079b68da8e8b8ca71a3cf0c"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.139744 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.141900 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.142202 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr" event={"ID":"0ecd210b-fb48-42fe-b161-6583d913b6f8","Type":"ContainerStarted","Data":"dc53302a36694b9ea0e2f94eda927202c2330529137343eaafe52b9a9fef2650"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.142613 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.144981 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr" 
Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.145014 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" event={"ID":"4cf3851e-6624-48c2-aa71-e799e6b6b685","Type":"ContainerStarted","Data":"d126b99bcdf6379925ae2bd1613d574e287cbfebc535fd2f3fb73d63512a01cd"} Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.168792 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.168880 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.173321 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-metrics-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: \"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.176245 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0c670280-553c-4251-ac28-04fdd66313a7-webhook-certs\") pod \"openstack-operator-controller-manager-6bb8cf96cb-6vrpp\" (UID: 
\"0c670280-553c-4251-ac28-04fdd66313a7\") " pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.179973 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" podStartSLOduration=6.32883481 podStartE2EDuration="33.17996017s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.740121237 +0000 UTC m=+1097.736582586" lastFinishedPulling="2025-12-03 22:23:55.591246597 +0000 UTC m=+1124.587707946" observedRunningTime="2025-12-03 22:23:59.152863033 +0000 UTC m=+1128.149324382" watchObservedRunningTime="2025-12-03 22:23:59.17996017 +0000 UTC m=+1128.176421519" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.186231 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" podStartSLOduration=3.441833812 podStartE2EDuration="33.18620699s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.70125121 +0000 UTC m=+1097.697712559" lastFinishedPulling="2025-12-03 22:23:58.445624378 +0000 UTC m=+1127.442085737" observedRunningTime="2025-12-03 22:23:59.180458574 +0000 UTC m=+1128.176919923" watchObservedRunningTime="2025-12-03 22:23:59.18620699 +0000 UTC m=+1128.182668339" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.234047 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" podStartSLOduration=6.297486949 podStartE2EDuration="33.234031194s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.746908473 +0000 UTC m=+1097.743369822" lastFinishedPulling="2025-12-03 22:23:55.683452708 +0000 UTC m=+1124.679914067" observedRunningTime="2025-12-03 22:23:59.226258162 +0000 UTC 
m=+1128.222719521" watchObservedRunningTime="2025-12-03 22:23:59.234031194 +0000 UTC m=+1128.230492543" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.250039 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" podStartSLOduration=6.2392386 podStartE2EDuration="33.250031089s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.76045223 +0000 UTC m=+1097.756913569" lastFinishedPulling="2025-12-03 22:23:55.771244709 +0000 UTC m=+1124.767706058" observedRunningTime="2025-12-03 22:23:59.249197967 +0000 UTC m=+1128.245659316" watchObservedRunningTime="2025-12-03 22:23:59.250031089 +0000 UTC m=+1128.246492438" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.277557 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" podStartSLOduration=5.426466131 podStartE2EDuration="32.277543179s" podCreationTimestamp="2025-12-03 22:23:27 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.740054936 +0000 UTC m=+1097.736516285" lastFinishedPulling="2025-12-03 22:23:55.591131984 +0000 UTC m=+1124.587593333" observedRunningTime="2025-12-03 22:23:59.272793699 +0000 UTC m=+1128.269255048" watchObservedRunningTime="2025-12-03 22:23:59.277543179 +0000 UTC m=+1128.274004528" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.311381 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" podStartSLOduration=20.769748767 podStartE2EDuration="33.31136207s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:43.141667791 +0000 UTC m=+1112.138129140" lastFinishedPulling="2025-12-03 22:23:55.683281054 +0000 UTC m=+1124.679742443" observedRunningTime="2025-12-03 22:23:59.292554457 +0000 UTC m=+1128.289015806" 
watchObservedRunningTime="2025-12-03 22:23:59.31136207 +0000 UTC m=+1128.307823419" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.311995 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" podStartSLOduration=3.408926068 podStartE2EDuration="33.311988256s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.640888868 +0000 UTC m=+1097.637350217" lastFinishedPulling="2025-12-03 22:23:58.543951056 +0000 UTC m=+1127.540412405" observedRunningTime="2025-12-03 22:23:59.308411349 +0000 UTC m=+1128.304872718" watchObservedRunningTime="2025-12-03 22:23:59.311988256 +0000 UTC m=+1128.308449605" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.335248 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-rhhdr" podStartSLOduration=4.856729755 podStartE2EDuration="33.33522968s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.373259097 +0000 UTC m=+1097.369720446" lastFinishedPulling="2025-12-03 22:23:56.851759022 +0000 UTC m=+1125.848220371" observedRunningTime="2025-12-03 22:23:59.331720534 +0000 UTC m=+1128.328181873" watchObservedRunningTime="2025-12-03 22:23:59.33522968 +0000 UTC m=+1128.331691029" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.362964 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qnp24" podStartSLOduration=4.349051042 podStartE2EDuration="33.362942175s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:27.877607232 +0000 UTC m=+1096.874068581" lastFinishedPulling="2025-12-03 22:23:56.891498365 +0000 UTC m=+1125.887959714" observedRunningTime="2025-12-03 22:23:59.356611042 +0000 UTC m=+1128.353072391" 
watchObservedRunningTime="2025-12-03 22:23:59.362942175 +0000 UTC m=+1128.359403524" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.366271 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.382487 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qcz7k" podStartSLOduration=4.826504732 podStartE2EDuration="33.382465147s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.339521599 +0000 UTC m=+1097.335982948" lastFinishedPulling="2025-12-03 22:23:56.895482024 +0000 UTC m=+1125.891943363" observedRunningTime="2025-12-03 22:23:59.377218964 +0000 UTC m=+1128.373680343" watchObservedRunningTime="2025-12-03 22:23:59.382465147 +0000 UTC m=+1128.378926496" Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.639774 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h"] Dec 03 22:23:59 crc kubenswrapper[4830]: W1203 22:23:59.651107 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d1112c1_ffce_45e1_94a4_3aad2ae50fe5.slice/crio-a0fe82ad6748f45a1e294fa5a96d71e0fe1b6d0b94d76f35ca4e7931e95a8e13 WatchSource:0}: Error finding container a0fe82ad6748f45a1e294fa5a96d71e0fe1b6d0b94d76f35ca4e7931e95a8e13: Status 404 returned error can't find the container with id a0fe82ad6748f45a1e294fa5a96d71e0fe1b6d0b94d76f35ca4e7931e95a8e13 Dec 03 22:23:59 crc kubenswrapper[4830]: W1203 22:23:59.793840 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c670280_553c_4251_ac28_04fdd66313a7.slice/crio-e8b6c709c28d139f099fc76d8bd9e95cfe5341780abe5f436dab15c440ff3922 WatchSource:0}: Error finding container e8b6c709c28d139f099fc76d8bd9e95cfe5341780abe5f436dab15c440ff3922: Status 404 returned error can't find the container with id e8b6c709c28d139f099fc76d8bd9e95cfe5341780abe5f436dab15c440ff3922 Dec 03 22:23:59 crc kubenswrapper[4830]: I1203 22:23:59.795911 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp"] Dec 03 22:24:00 crc kubenswrapper[4830]: I1203 22:24:00.156859 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" event={"ID":"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5","Type":"ContainerStarted","Data":"a0fe82ad6748f45a1e294fa5a96d71e0fe1b6d0b94d76f35ca4e7931e95a8e13"} Dec 03 22:24:00 crc kubenswrapper[4830]: I1203 22:24:00.159178 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" event={"ID":"0c670280-553c-4251-ac28-04fdd66313a7","Type":"ContainerStarted","Data":"e8b6c709c28d139f099fc76d8bd9e95cfe5341780abe5f436dab15c440ff3922"} Dec 03 22:24:00 crc kubenswrapper[4830]: I1203 22:24:00.161827 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:24:01 crc kubenswrapper[4830]: I1203 22:24:01.196399 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" event={"ID":"55d8936f-55fc-4a92-b8a2-c393b6b46eeb","Type":"ContainerStarted","Data":"749ad2144d9ff6c14dbecb2e777ab556c472aae6724ffdbb664ce842f40b0823"} Dec 03 22:24:01 crc kubenswrapper[4830]: I1203 22:24:01.197214 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" Dec 03 22:24:01 crc kubenswrapper[4830]: I1203 22:24:01.199315 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" event={"ID":"0c670280-553c-4251-ac28-04fdd66313a7","Type":"ContainerStarted","Data":"e8c53aafc94e8249a57da2b059ec58a624766d80326593d7ebe999546e0d6931"} Dec 03 22:24:01 crc kubenswrapper[4830]: I1203 22:24:01.225762 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" podStartSLOduration=4.396670652 podStartE2EDuration="35.225745536s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:28.679734125 +0000 UTC m=+1097.676195474" lastFinishedPulling="2025-12-03 22:23:59.508809009 +0000 UTC m=+1128.505270358" observedRunningTime="2025-12-03 22:24:01.213902983 +0000 UTC m=+1130.210364322" watchObservedRunningTime="2025-12-03 22:24:01.225745536 +0000 UTC m=+1130.222206885" Dec 03 22:24:01 crc kubenswrapper[4830]: I1203 22:24:01.244658 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" podStartSLOduration=34.24463631 podStartE2EDuration="34.24463631s" podCreationTimestamp="2025-12-03 22:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:01.243018917 +0000 UTC m=+1130.239480266" watchObservedRunningTime="2025-12-03 22:24:01.24463631 +0000 UTC m=+1130.241097669" Dec 03 22:24:02 crc kubenswrapper[4830]: I1203 22:24:02.210969 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:24:02 crc kubenswrapper[4830]: I1203 
22:24:02.602699 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-pd68x" Dec 03 22:24:03 crc kubenswrapper[4830]: I1203 22:24:03.226224 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" event={"ID":"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5","Type":"ContainerStarted","Data":"2e1cd199be7171a6e507db8683f64917cca41ef0f26ead1944426b5be65dcb85"} Dec 03 22:24:03 crc kubenswrapper[4830]: I1203 22:24:03.226283 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" event={"ID":"7d1112c1-ffce-45e1-94a4-3aad2ae50fe5","Type":"ContainerStarted","Data":"444f2817d718a6f54962bfa7ff626b543ae37dc124947cfc10af9b62c21d36a3"} Dec 03 22:24:03 crc kubenswrapper[4830]: I1203 22:24:03.226318 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:24:03 crc kubenswrapper[4830]: I1203 22:24:03.277923 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" podStartSLOduration=34.883322688 podStartE2EDuration="37.277901245s" podCreationTimestamp="2025-12-03 22:23:26 +0000 UTC" firstStartedPulling="2025-12-03 22:23:59.653952591 +0000 UTC m=+1128.650413960" lastFinishedPulling="2025-12-03 22:24:02.048531168 +0000 UTC m=+1131.044992517" observedRunningTime="2025-12-03 22:24:03.274960955 +0000 UTC m=+1132.271422334" watchObservedRunningTime="2025-12-03 22:24:03.277901245 +0000 UTC m=+1132.274362634" Dec 03 22:24:06 crc kubenswrapper[4830]: I1203 22:24:06.938955 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-nggxg" Dec 
03 22:24:07 crc kubenswrapper[4830]: I1203 22:24:07.212352 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4fshq" Dec 03 22:24:07 crc kubenswrapper[4830]: I1203 22:24:07.244415 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-8wlrd" Dec 03 22:24:07 crc kubenswrapper[4830]: I1203 22:24:07.260466 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-48cc9" Dec 03 22:24:07 crc kubenswrapper[4830]: I1203 22:24:07.293911 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-526ng" Dec 03 22:24:07 crc kubenswrapper[4830]: I1203 22:24:07.414236 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rs6wd" Dec 03 22:24:07 crc kubenswrapper[4830]: I1203 22:24:07.743198 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-7bhdq" Dec 03 22:24:07 crc kubenswrapper[4830]: I1203 22:24:07.780918 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-59779d887b-2cqbq" Dec 03 22:24:07 crc kubenswrapper[4830]: I1203 22:24:07.791058 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-dtdgr" Dec 03 22:24:09 crc kubenswrapper[4830]: I1203 22:24:09.140021 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h" Dec 03 22:24:09 crc kubenswrapper[4830]: I1203 
22:24:09.374055 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6bb8cf96cb-6vrpp" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.648355 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ldj7f"] Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.661194 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.664339 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-vthmk" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.664732 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.664914 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.665158 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.671223 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ldj7f"] Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.716935 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nl9bz"] Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.718295 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.722352 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.732384 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nl9bz"] Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.740259 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7f4l\" (UniqueName: \"kubernetes.io/projected/588e3fa1-4978-4c0b-83b9-258582e0b06f-kube-api-access-p7f4l\") pod \"dnsmasq-dns-675f4bcbfc-ldj7f\" (UID: \"588e3fa1-4978-4c0b-83b9-258582e0b06f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.740362 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588e3fa1-4978-4c0b-83b9-258582e0b06f-config\") pod \"dnsmasq-dns-675f4bcbfc-ldj7f\" (UID: \"588e3fa1-4978-4c0b-83b9-258582e0b06f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.841953 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588e3fa1-4978-4c0b-83b9-258582e0b06f-config\") pod \"dnsmasq-dns-675f4bcbfc-ldj7f\" (UID: \"588e3fa1-4978-4c0b-83b9-258582e0b06f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.842241 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-nl9bz\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:24:26 crc 
kubenswrapper[4830]: I1203 22:24:26.842260 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sfbv\" (UniqueName: \"kubernetes.io/projected/edd64b8e-d074-4720-924b-2ee99744a69a-kube-api-access-5sfbv\") pod \"dnsmasq-dns-78dd6ddcc-nl9bz\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.842278 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-config\") pod \"dnsmasq-dns-78dd6ddcc-nl9bz\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.842324 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7f4l\" (UniqueName: \"kubernetes.io/projected/588e3fa1-4978-4c0b-83b9-258582e0b06f-kube-api-access-p7f4l\") pod \"dnsmasq-dns-675f4bcbfc-ldj7f\" (UID: \"588e3fa1-4978-4c0b-83b9-258582e0b06f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.842952 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588e3fa1-4978-4c0b-83b9-258582e0b06f-config\") pod \"dnsmasq-dns-675f4bcbfc-ldj7f\" (UID: \"588e3fa1-4978-4c0b-83b9-258582e0b06f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.868964 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7f4l\" (UniqueName: \"kubernetes.io/projected/588e3fa1-4978-4c0b-83b9-258582e0b06f-kube-api-access-p7f4l\") pod \"dnsmasq-dns-675f4bcbfc-ldj7f\" (UID: \"588e3fa1-4978-4c0b-83b9-258582e0b06f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 
22:24:26.944775 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sfbv\" (UniqueName: \"kubernetes.io/projected/edd64b8e-d074-4720-924b-2ee99744a69a-kube-api-access-5sfbv\") pod \"dnsmasq-dns-78dd6ddcc-nl9bz\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.944809 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-nl9bz\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.944829 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-config\") pod \"dnsmasq-dns-78dd6ddcc-nl9bz\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.945777 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-config\") pod \"dnsmasq-dns-78dd6ddcc-nl9bz\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.945778 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-nl9bz\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.964065 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sfbv\" 
(UniqueName: \"kubernetes.io/projected/edd64b8e-d074-4720-924b-2ee99744a69a-kube-api-access-5sfbv\") pod \"dnsmasq-dns-78dd6ddcc-nl9bz\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:24:26 crc kubenswrapper[4830]: I1203 22:24:26.987171 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" Dec 03 22:24:27 crc kubenswrapper[4830]: I1203 22:24:27.032221 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:24:27 crc kubenswrapper[4830]: I1203 22:24:27.440944 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ldj7f"] Dec 03 22:24:27 crc kubenswrapper[4830]: I1203 22:24:27.456126 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" event={"ID":"588e3fa1-4978-4c0b-83b9-258582e0b06f","Type":"ContainerStarted","Data":"d5815188f63097d5c5328e33672ef9db4758b6d204197a16263cae7d6491f401"} Dec 03 22:24:27 crc kubenswrapper[4830]: I1203 22:24:27.518501 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nl9bz"] Dec 03 22:24:27 crc kubenswrapper[4830]: W1203 22:24:27.521956 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedd64b8e_d074_4720_924b_2ee99744a69a.slice/crio-0e4dd14b9ec130c6417bc438586f2d04b4699443204ee05dc0a295efd1716461 WatchSource:0}: Error finding container 0e4dd14b9ec130c6417bc438586f2d04b4699443204ee05dc0a295efd1716461: Status 404 returned error can't find the container with id 0e4dd14b9ec130c6417bc438586f2d04b4699443204ee05dc0a295efd1716461 Dec 03 22:24:28 crc kubenswrapper[4830]: I1203 22:24:28.482639 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" 
event={"ID":"edd64b8e-d074-4720-924b-2ee99744a69a","Type":"ContainerStarted","Data":"0e4dd14b9ec130c6417bc438586f2d04b4699443204ee05dc0a295efd1716461"} Dec 03 22:24:28 crc kubenswrapper[4830]: I1203 22:24:28.753973 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ldj7f"] Dec 03 22:24:28 crc kubenswrapper[4830]: I1203 22:24:28.792143 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lj9s"] Dec 03 22:24:28 crc kubenswrapper[4830]: I1203 22:24:28.793691 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" Dec 03 22:24:28 crc kubenswrapper[4830]: I1203 22:24:28.801155 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lj9s"] Dec 03 22:24:28 crc kubenswrapper[4830]: I1203 22:24:28.902528 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-config\") pod \"dnsmasq-dns-5ccc8479f9-9lj9s\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" Dec 03 22:24:28 crc kubenswrapper[4830]: I1203 22:24:28.902827 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhj57\" (UniqueName: \"kubernetes.io/projected/1a6ed877-b9b1-4c85-b587-f98928d02441-kube-api-access-xhj57\") pod \"dnsmasq-dns-5ccc8479f9-9lj9s\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" Dec 03 22:24:28 crc kubenswrapper[4830]: I1203 22:24:28.903053 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-9lj9s\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.004687 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-9lj9s\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.004745 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-config\") pod \"dnsmasq-dns-5ccc8479f9-9lj9s\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.004796 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhj57\" (UniqueName: \"kubernetes.io/projected/1a6ed877-b9b1-4c85-b587-f98928d02441-kube-api-access-xhj57\") pod \"dnsmasq-dns-5ccc8479f9-9lj9s\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.005738 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-9lj9s\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.005757 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-config\") pod \"dnsmasq-dns-5ccc8479f9-9lj9s\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.028357 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhj57\" (UniqueName: \"kubernetes.io/projected/1a6ed877-b9b1-4c85-b587-f98928d02441-kube-api-access-xhj57\") pod \"dnsmasq-dns-5ccc8479f9-9lj9s\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.091326 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nl9bz"]
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.112526 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.115006 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vjkwl"]
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.116258 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.124404 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vjkwl"]
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.207703 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-config\") pod \"dnsmasq-dns-57d769cc4f-vjkwl\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.207825 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gq5w\" (UniqueName: \"kubernetes.io/projected/303014fe-7aad-489f-9fd9-51d7f373325e-kube-api-access-2gq5w\") pod \"dnsmasq-dns-57d769cc4f-vjkwl\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.207856 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vjkwl\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.308719 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-config\") pod \"dnsmasq-dns-57d769cc4f-vjkwl\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.308793 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gq5w\" (UniqueName: \"kubernetes.io/projected/303014fe-7aad-489f-9fd9-51d7f373325e-kube-api-access-2gq5w\") pod \"dnsmasq-dns-57d769cc4f-vjkwl\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.308818 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vjkwl\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.309714 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vjkwl\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.309965 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-config\") pod \"dnsmasq-dns-57d769cc4f-vjkwl\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.327530 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gq5w\" (UniqueName: \"kubernetes.io/projected/303014fe-7aad-489f-9fd9-51d7f373325e-kube-api-access-2gq5w\") pod \"dnsmasq-dns-57d769cc4f-vjkwl\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.431120 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.561949 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lj9s"]
Dec 03 22:24:29 crc kubenswrapper[4830]: W1203 22:24:29.566151 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a6ed877_b9b1_4c85_b587_f98928d02441.slice/crio-dd8d6f31d5b69c7906e07db851c527783bfcd68d330cff2ff49e2d8a02fcbec0 WatchSource:0}: Error finding container dd8d6f31d5b69c7906e07db851c527783bfcd68d330cff2ff49e2d8a02fcbec0: Status 404 returned error can't find the container with id dd8d6f31d5b69c7906e07db851c527783bfcd68d330cff2ff49e2d8a02fcbec0
Dec 03 22:24:29 crc kubenswrapper[4830]: W1203 22:24:29.881461 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod303014fe_7aad_489f_9fd9_51d7f373325e.slice/crio-b00fbeed1ab06dfeae0b8e6a69ba661cae89803401f47a9ec428d70d45170753 WatchSource:0}: Error finding container b00fbeed1ab06dfeae0b8e6a69ba661cae89803401f47a9ec428d70d45170753: Status 404 returned error can't find the container with id b00fbeed1ab06dfeae0b8e6a69ba661cae89803401f47a9ec428d70d45170753
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.882300 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vjkwl"]
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.941325 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.943658 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.949596 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jjng6"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.949616 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.949789 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.949824 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.960413 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.992395 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.992740 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 03 22:24:29 crc kubenswrapper[4830]: I1203 22:24:29.993147 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.120133 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.120221 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.120637 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.120835 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.120980 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.121176 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.121449 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.121577 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.121653 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.121742 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl9qq\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-kube-api-access-jl9qq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.121867 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-16809d31-3801-4939-90b2-8372afe3cbea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.225374 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.223782 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.226010 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.226085 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.226162 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.228131 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.228244 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.228304 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.228382 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.228468 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.228503 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.228581 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.228635 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl9qq\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-kube-api-access-jl9qq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.228682 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-16809d31-3801-4939-90b2-8372afe3cbea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.231999 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.233172 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.236683 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.236841 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.236917 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-16809d31-3801-4939-90b2-8372afe3cbea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/856d33f0f69f6ddab83e3004572787ee6d83b55286f5e7938a47fe9f0c93e013/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.238481 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.240905 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.240922 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.265408 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl9qq\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-kube-api-access-jl9qq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.281397 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-16809d31-3801-4939-90b2-8372afe3cbea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea\") pod \"rabbitmq-cell1-server-0\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.307901 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.508772 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" event={"ID":"1a6ed877-b9b1-4c85-b587-f98928d02441","Type":"ContainerStarted","Data":"dd8d6f31d5b69c7906e07db851c527783bfcd68d330cff2ff49e2d8a02fcbec0"}
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.513315 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl" event={"ID":"303014fe-7aad-489f-9fd9-51d7f373325e","Type":"ContainerStarted","Data":"b00fbeed1ab06dfeae0b8e6a69ba661cae89803401f47a9ec428d70d45170753"}
Dec 03 22:24:30 crc kubenswrapper[4830]: I1203 22:24:30.585397 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.284900 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.287613 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.289659 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.292895 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.294118 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.294363 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.294500 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.294764 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-d8d68"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.294984 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.295148 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.350721 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.350771 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.350795 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnftw\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-kube-api-access-tnftw\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.350829 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.350904 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.350960 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.350982 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.351076 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-config-data\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.351101 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.351135 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.351171 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.456023 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.456080 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.456127 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.456155 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.456276 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnftw\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-kube-api-access-tnftw\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.456304 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.456330 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.456360 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.456465 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.456534 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-config-data\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.456573 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.457382 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.457779 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.457870 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.458051 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.458316 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-config-data\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.461890 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.462117 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.464280 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.466498 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.466563 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e508f47409051aeeca58235538460d6ef03a80bec1be13d5a3f95b5d169f15a1/globalmount\"" pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.471648 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.476831 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnftw\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-kube-api-access-tnftw\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.496920 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\") pod \"rabbitmq-server-0\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " pod="openstack/rabbitmq-server-0"
Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.527722 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0"
event={"ID":"3fc13f96-b9cf-4e92-bbe6-2c3719041e59","Type":"ContainerStarted","Data":"c3c73ec921fa2aa67e815b9cc1f0737a0b8860fa25edfe6a9c662b7c45cfea09"} Dec 03 22:24:31 crc kubenswrapper[4830]: I1203 22:24:31.657783 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 22:24:32 crc kubenswrapper[4830]: I1203 22:24:32.130461 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 22:24:32 crc kubenswrapper[4830]: I1203 22:24:32.559736 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1","Type":"ContainerStarted","Data":"ba29a15596d43c544ee5fb22add28ad2e5393a7cf2b852dab02453018ff54652"} Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.231641 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.233135 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.238135 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.238203 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xvrsc" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.238438 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.238463 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.246811 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.249781 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.396640 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21e1ac03-6466-4663-bff2-68ff2cc7801d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.398421 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wcxf\" (UniqueName: \"kubernetes.io/projected/21e1ac03-6466-4663-bff2-68ff2cc7801d-kube-api-access-6wcxf\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.398461 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/21e1ac03-6466-4663-bff2-68ff2cc7801d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.398520 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/21e1ac03-6466-4663-bff2-68ff2cc7801d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.398558 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8bb16df7-75a5-4c27-be2d-be822b5149f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bb16df7-75a5-4c27-be2d-be822b5149f8\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.398574 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e1ac03-6466-4663-bff2-68ff2cc7801d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.398595 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/21e1ac03-6466-4663-bff2-68ff2cc7801d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc 
kubenswrapper[4830]: I1203 22:24:33.398627 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e1ac03-6466-4663-bff2-68ff2cc7801d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.500429 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21e1ac03-6466-4663-bff2-68ff2cc7801d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.500490 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wcxf\" (UniqueName: \"kubernetes.io/projected/21e1ac03-6466-4663-bff2-68ff2cc7801d-kube-api-access-6wcxf\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.500582 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/21e1ac03-6466-4663-bff2-68ff2cc7801d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.500622 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/21e1ac03-6466-4663-bff2-68ff2cc7801d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.500654 
4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8bb16df7-75a5-4c27-be2d-be822b5149f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bb16df7-75a5-4c27-be2d-be822b5149f8\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.500670 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e1ac03-6466-4663-bff2-68ff2cc7801d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.500688 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/21e1ac03-6466-4663-bff2-68ff2cc7801d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.500714 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e1ac03-6466-4663-bff2-68ff2cc7801d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.503619 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/21e1ac03-6466-4663-bff2-68ff2cc7801d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.504349 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/21e1ac03-6466-4663-bff2-68ff2cc7801d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.504795 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21e1ac03-6466-4663-bff2-68ff2cc7801d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.507820 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.507855 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8bb16df7-75a5-4c27-be2d-be822b5149f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bb16df7-75a5-4c27-be2d-be822b5149f8\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9b9cafd594587ac1963cd348c766ea167e0a836b7be8722b62eea5a621d01235/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.509178 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e1ac03-6466-4663-bff2-68ff2cc7801d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.522131 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e1ac03-6466-4663-bff2-68ff2cc7801d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.524586 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/21e1ac03-6466-4663-bff2-68ff2cc7801d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.530133 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wcxf\" (UniqueName: \"kubernetes.io/projected/21e1ac03-6466-4663-bff2-68ff2cc7801d-kube-api-access-6wcxf\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.549579 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.550651 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.553842 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-rvmqn" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.553956 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.553845 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.570098 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.573631 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.579322 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.579644 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-rm2f7" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.579773 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.579776 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.586825 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.611332 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8bb16df7-75a5-4c27-be2d-be822b5149f8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bb16df7-75a5-4c27-be2d-be822b5149f8\") pod \"openstack-cell1-galera-0\" (UID: \"21e1ac03-6466-4663-bff2-68ff2cc7801d\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.613286 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709453 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709498 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/85bd20b1-76d6-4238-be14-1c5891d6bbd8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709537 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2klv\" (UniqueName: \"kubernetes.io/projected/85bd20b1-76d6-4238-be14-1c5891d6bbd8-kube-api-access-j2klv\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709571 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-162fc418-d645-4bcc-ac8d-8fa989af1696\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-162fc418-d645-4bcc-ac8d-8fa989af1696\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc 
kubenswrapper[4830]: I1203 22:24:33.709616 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz2rg\" (UniqueName: \"kubernetes.io/projected/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-kube-api-access-fz2rg\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709634 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/85bd20b1-76d6-4238-be14-1c5891d6bbd8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709657 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/85bd20b1-76d6-4238-be14-1c5891d6bbd8-config-data-default\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709681 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/85bd20b1-76d6-4238-be14-1c5891d6bbd8-kolla-config\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709701 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709734 
4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-kolla-config\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709768 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85bd20b1-76d6-4238-be14-1c5891d6bbd8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709786 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85bd20b1-76d6-4238-be14-1c5891d6bbd8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.709802 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-config-data\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811006 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811068 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/85bd20b1-76d6-4238-be14-1c5891d6bbd8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811092 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2klv\" (UniqueName: \"kubernetes.io/projected/85bd20b1-76d6-4238-be14-1c5891d6bbd8-kube-api-access-j2klv\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811120 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-162fc418-d645-4bcc-ac8d-8fa989af1696\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-162fc418-d645-4bcc-ac8d-8fa989af1696\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811161 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz2rg\" (UniqueName: \"kubernetes.io/projected/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-kube-api-access-fz2rg\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811178 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/85bd20b1-76d6-4238-be14-1c5891d6bbd8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811198 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/85bd20b1-76d6-4238-be14-1c5891d6bbd8-config-data-default\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811218 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/85bd20b1-76d6-4238-be14-1c5891d6bbd8-kolla-config\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811236 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811273 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-kolla-config\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811309 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85bd20b1-76d6-4238-be14-1c5891d6bbd8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811326 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85bd20b1-76d6-4238-be14-1c5891d6bbd8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " 
pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.811341 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-config-data\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.812081 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/85bd20b1-76d6-4238-be14-1c5891d6bbd8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.812218 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-config-data\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.812319 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/85bd20b1-76d6-4238-be14-1c5891d6bbd8-config-data-default\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.812642 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-kolla-config\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.812959 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/85bd20b1-76d6-4238-be14-1c5891d6bbd8-kolla-config\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.813600 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85bd20b1-76d6-4238-be14-1c5891d6bbd8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.816713 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85bd20b1-76d6-4238-be14-1c5891d6bbd8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.816940 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.816972 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-162fc418-d645-4bcc-ac8d-8fa989af1696\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-162fc418-d645-4bcc-ac8d-8fa989af1696\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9646269b0f217a6a5430c22073320b1bbbb7701de47bb994c5c7fb0e83c85e3c/globalmount\"" pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.818966 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.819291 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.819779 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/85bd20b1-76d6-4238-be14-1c5891d6bbd8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.830891 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2klv\" (UniqueName: \"kubernetes.io/projected/85bd20b1-76d6-4238-be14-1c5891d6bbd8-kube-api-access-j2klv\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " 
pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.831322 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz2rg\" (UniqueName: \"kubernetes.io/projected/deb3672e-3fb5-4549-ae27-6f7402c1e3d8-kube-api-access-fz2rg\") pod \"memcached-0\" (UID: \"deb3672e-3fb5-4549-ae27-6f7402c1e3d8\") " pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.862953 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.863412 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-162fc418-d645-4bcc-ac8d-8fa989af1696\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-162fc418-d645-4bcc-ac8d-8fa989af1696\") pod \"openstack-galera-0\" (UID: \"85bd20b1-76d6-4238-be14-1c5891d6bbd8\") " pod="openstack/openstack-galera-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.908249 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 22:24:33 crc kubenswrapper[4830]: I1203 22:24:33.919236 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 22:24:34 crc kubenswrapper[4830]: I1203 22:24:34.484986 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 22:24:34 crc kubenswrapper[4830]: W1203 22:24:34.495657 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e1ac03_6466_4663_bff2_68ff2cc7801d.slice/crio-528c007827ce46d72fb7b38e86ecf4e24a09035ee87c657fe21af44351fd4477 WatchSource:0}: Error finding container 528c007827ce46d72fb7b38e86ecf4e24a09035ee87c657fe21af44351fd4477: Status 404 returned error can't find the container with id 528c007827ce46d72fb7b38e86ecf4e24a09035ee87c657fe21af44351fd4477 Dec 03 22:24:34 crc kubenswrapper[4830]: I1203 22:24:34.577020 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 22:24:34 crc kubenswrapper[4830]: W1203 22:24:34.584613 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb3672e_3fb5_4549_ae27_6f7402c1e3d8.slice/crio-b5fe693921d09f0bf6a80ba0914dae5bb713647dbd4c3bc2a829716fdd5a22db WatchSource:0}: Error finding container b5fe693921d09f0bf6a80ba0914dae5bb713647dbd4c3bc2a829716fdd5a22db: Status 404 returned error can't find the container with id b5fe693921d09f0bf6a80ba0914dae5bb713647dbd4c3bc2a829716fdd5a22db Dec 03 22:24:34 crc kubenswrapper[4830]: I1203 22:24:34.593529 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"deb3672e-3fb5-4549-ae27-6f7402c1e3d8","Type":"ContainerStarted","Data":"b5fe693921d09f0bf6a80ba0914dae5bb713647dbd4c3bc2a829716fdd5a22db"} Dec 03 22:24:34 crc kubenswrapper[4830]: I1203 22:24:34.594831 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"21e1ac03-6466-4663-bff2-68ff2cc7801d","Type":"ContainerStarted","Data":"528c007827ce46d72fb7b38e86ecf4e24a09035ee87c657fe21af44351fd4477"} Dec 03 22:24:34 crc kubenswrapper[4830]: I1203 22:24:34.653579 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 22:24:34 crc kubenswrapper[4830]: W1203 22:24:34.657207 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85bd20b1_76d6_4238_be14_1c5891d6bbd8.slice/crio-f9a49a6e09a91f86b3b05ee5c056d58cdb61b6333d9500ae707c9c12dad591f3 WatchSource:0}: Error finding container f9a49a6e09a91f86b3b05ee5c056d58cdb61b6333d9500ae707c9c12dad591f3: Status 404 returned error can't find the container with id f9a49a6e09a91f86b3b05ee5c056d58cdb61b6333d9500ae707c9c12dad591f3 Dec 03 22:24:35 crc kubenswrapper[4830]: I1203 22:24:35.332406 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:24:35 crc kubenswrapper[4830]: I1203 22:24:35.333676 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:24:35 crc kubenswrapper[4830]: I1203 22:24:35.349816 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jfmmq" Dec 03 22:24:35 crc kubenswrapper[4830]: I1203 22:24:35.358436 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:24:35 crc kubenswrapper[4830]: I1203 22:24:35.441033 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td5f4\" (UniqueName: \"kubernetes.io/projected/443cf1a9-f7ab-413e-bddf-08978b24fc87-kube-api-access-td5f4\") pod \"kube-state-metrics-0\" (UID: \"443cf1a9-f7ab-413e-bddf-08978b24fc87\") " pod="openstack/kube-state-metrics-0" Dec 03 22:24:35 crc kubenswrapper[4830]: I1203 22:24:35.542441 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td5f4\" (UniqueName: \"kubernetes.io/projected/443cf1a9-f7ab-413e-bddf-08978b24fc87-kube-api-access-td5f4\") pod \"kube-state-metrics-0\" (UID: \"443cf1a9-f7ab-413e-bddf-08978b24fc87\") " pod="openstack/kube-state-metrics-0" Dec 03 22:24:35 crc kubenswrapper[4830]: I1203 22:24:35.596537 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td5f4\" (UniqueName: \"kubernetes.io/projected/443cf1a9-f7ab-413e-bddf-08978b24fc87-kube-api-access-td5f4\") pod \"kube-state-metrics-0\" (UID: \"443cf1a9-f7ab-413e-bddf-08978b24fc87\") " pod="openstack/kube-state-metrics-0" Dec 03 22:24:35 crc kubenswrapper[4830]: I1203 22:24:35.622711 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"85bd20b1-76d6-4238-be14-1c5891d6bbd8","Type":"ContainerStarted","Data":"f9a49a6e09a91f86b3b05ee5c056d58cdb61b6333d9500ae707c9c12dad591f3"} Dec 03 22:24:35 crc kubenswrapper[4830]: I1203 22:24:35.664841 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:24:35 crc kubenswrapper[4830]: I1203 22:24:35.998479 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.000225 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.002872 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.003026 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.003045 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-8c4pl" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.003130 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.003407 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.008654 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.058442 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/cd27b153-5334-4329-91de-0e6941ae9e97-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 
22:24:36.058480 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd27b153-5334-4329-91de-0e6941ae9e97-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.058542 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd27b153-5334-4329-91de-0e6941ae9e97-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.058842 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cd27b153-5334-4329-91de-0e6941ae9e97-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.058905 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cd27b153-5334-4329-91de-0e6941ae9e97-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.059012 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd27b153-5334-4329-91de-0e6941ae9e97-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc 
kubenswrapper[4830]: I1203 22:24:36.059033 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl64q\" (UniqueName: \"kubernetes.io/projected/cd27b153-5334-4329-91de-0e6941ae9e97-kube-api-access-hl64q\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.159877 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd27b153-5334-4329-91de-0e6941ae9e97-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.159963 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cd27b153-5334-4329-91de-0e6941ae9e97-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.159989 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cd27b153-5334-4329-91de-0e6941ae9e97-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.160021 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd27b153-5334-4329-91de-0e6941ae9e97-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 
22:24:36.160044 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl64q\" (UniqueName: \"kubernetes.io/projected/cd27b153-5334-4329-91de-0e6941ae9e97-kube-api-access-hl64q\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.160162 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/cd27b153-5334-4329-91de-0e6941ae9e97-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.160183 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd27b153-5334-4329-91de-0e6941ae9e97-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.163929 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd27b153-5334-4329-91de-0e6941ae9e97-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.164379 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/cd27b153-5334-4329-91de-0e6941ae9e97-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 
22:24:36.167093 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd27b153-5334-4329-91de-0e6941ae9e97-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.168070 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cd27b153-5334-4329-91de-0e6941ae9e97-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.170061 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cd27b153-5334-4329-91de-0e6941ae9e97-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.175149 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd27b153-5334-4329-91de-0e6941ae9e97-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.179089 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl64q\" (UniqueName: \"kubernetes.io/projected/cd27b153-5334-4329-91de-0e6941ae9e97-kube-api-access-hl64q\") pod \"alertmanager-metric-storage-0\" (UID: \"cd27b153-5334-4329-91de-0e6941ae9e97\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.335661 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.662760 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.669454 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.678873 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.679057 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.679268 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kmlv2" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.679091 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.679120 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.679392 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.679497 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.771257 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.771325 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.771347 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.771365 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e38be206-c963-42f9-834d-a9263b18cbed-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.771385 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.771411 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e38be206-c963-42f9-834d-a9263b18cbed-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.771454 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.771477 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7vt2\" (UniqueName: \"kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-kube-api-access-h7vt2\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.872963 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7vt2\" (UniqueName: \"kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-kube-api-access-h7vt2\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.873042 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-config\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.873094 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.873141 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.873165 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e38be206-c963-42f9-834d-a9263b18cbed-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.873187 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.873218 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e38be206-c963-42f9-834d-a9263b18cbed-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.873271 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.874446 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e38be206-c963-42f9-834d-a9263b18cbed-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.877700 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.877737 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/76467b962fce3d483edcc8f8bee5aee42e3f124190cf9bb015fb947a94c7c8bb/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.878144 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-config\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.880766 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.885185 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.886937 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.890492 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7vt2\" (UniqueName: \"kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-kube-api-access-h7vt2\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.899350 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e38be206-c963-42f9-834d-a9263b18cbed-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:36 crc kubenswrapper[4830]: I1203 22:24:36.920973 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\") pod \"prometheus-metric-storage-0\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:37 crc kubenswrapper[4830]: I1203 22:24:37.001259 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.423712 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nlxm7"] Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.425181 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.430365 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.430576 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zvbnk" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.430812 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.442029 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nlxm7"] Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.496776 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8j27r"] Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.498615 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.505675 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8j27r"] Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.533932 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6bdf507c-05be-4df8-8c33-85f15c05237c-var-log-ovn\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.533975 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6bdf507c-05be-4df8-8c33-85f15c05237c-var-run\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.534029 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdf507c-05be-4df8-8c33-85f15c05237c-combined-ca-bundle\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.534146 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6bdf507c-05be-4df8-8c33-85f15c05237c-var-run-ovn\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.534231 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6bdf507c-05be-4df8-8c33-85f15c05237c-ovn-controller-tls-certs\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.534274 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjtjb\" (UniqueName: \"kubernetes.io/projected/6bdf507c-05be-4df8-8c33-85f15c05237c-kube-api-access-qjtjb\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.534318 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bdf507c-05be-4df8-8c33-85f15c05237c-scripts\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636123 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bdf507c-05be-4df8-8c33-85f15c05237c-scripts\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636196 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-var-log\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636278 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6bdf507c-05be-4df8-8c33-85f15c05237c-var-log-ovn\") pod 
\"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636309 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x7ds\" (UniqueName: \"kubernetes.io/projected/b0d02b19-e65a-4c45-b658-a34c69cdf74e-kube-api-access-5x7ds\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636334 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6bdf507c-05be-4df8-8c33-85f15c05237c-var-run\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636361 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdf507c-05be-4df8-8c33-85f15c05237c-combined-ca-bundle\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636416 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6bdf507c-05be-4df8-8c33-85f15c05237c-var-run-ovn\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636448 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0d02b19-e65a-4c45-b658-a34c69cdf74e-scripts\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " 
pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636518 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-var-lib\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636549 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-var-run\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636582 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bdf507c-05be-4df8-8c33-85f15c05237c-ovn-controller-tls-certs\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636603 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-etc-ovs\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.636634 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjtjb\" (UniqueName: \"kubernetes.io/projected/6bdf507c-05be-4df8-8c33-85f15c05237c-kube-api-access-qjtjb\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc 
kubenswrapper[4830]: I1203 22:24:39.638260 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6bdf507c-05be-4df8-8c33-85f15c05237c-var-log-ovn\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.638888 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6bdf507c-05be-4df8-8c33-85f15c05237c-var-run-ovn\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.639074 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6bdf507c-05be-4df8-8c33-85f15c05237c-var-run\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.642999 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bdf507c-05be-4df8-8c33-85f15c05237c-scripts\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.662908 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdf507c-05be-4df8-8c33-85f15c05237c-combined-ca-bundle\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.666707 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjtjb\" (UniqueName: 
\"kubernetes.io/projected/6bdf507c-05be-4df8-8c33-85f15c05237c-kube-api-access-qjtjb\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.686264 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bdf507c-05be-4df8-8c33-85f15c05237c-ovn-controller-tls-certs\") pod \"ovn-controller-nlxm7\" (UID: \"6bdf507c-05be-4df8-8c33-85f15c05237c\") " pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.738390 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x7ds\" (UniqueName: \"kubernetes.io/projected/b0d02b19-e65a-4c45-b658-a34c69cdf74e-kube-api-access-5x7ds\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.738522 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0d02b19-e65a-4c45-b658-a34c69cdf74e-scripts\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.738576 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-var-lib\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.738606 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-var-run\") pod \"ovn-controller-ovs-8j27r\" (UID: 
\"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.738637 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-etc-ovs\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.738685 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-var-log\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.738924 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-var-log\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.738970 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-var-lib\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.738997 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-var-run\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.739138 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b0d02b19-e65a-4c45-b658-a34c69cdf74e-etc-ovs\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.740696 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0d02b19-e65a-4c45-b658-a34c69cdf74e-scripts\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.746257 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nlxm7" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.777736 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x7ds\" (UniqueName: \"kubernetes.io/projected/b0d02b19-e65a-4c45-b658-a34c69cdf74e-kube-api-access-5x7ds\") pod \"ovn-controller-ovs-8j27r\" (UID: \"b0d02b19-e65a-4c45-b658-a34c69cdf74e\") " pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:39 crc kubenswrapper[4830]: I1203 22:24:39.814043 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.338538 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.339946 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.343646 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.343910 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.344053 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-j6md6" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.344181 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.344317 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.350081 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.449828 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63965380-d86f-4abf-9c9c-4d5a25ad6754-config\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.449904 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e1085917-1e37-44d3-bb87-79f567b4c67c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1085917-1e37-44d3-bb87-79f567b4c67c\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.449933 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63965380-d86f-4abf-9c9c-4d5a25ad6754-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.449957 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63965380-d86f-4abf-9c9c-4d5a25ad6754-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.449976 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jspws\" (UniqueName: \"kubernetes.io/projected/63965380-d86f-4abf-9c9c-4d5a25ad6754-kube-api-access-jspws\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.450033 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63965380-d86f-4abf-9c9c-4d5a25ad6754-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.450064 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63965380-d86f-4abf-9c9c-4d5a25ad6754-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.450109 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63965380-d86f-4abf-9c9c-4d5a25ad6754-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.551563 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63965380-d86f-4abf-9c9c-4d5a25ad6754-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.551636 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63965380-d86f-4abf-9c9c-4d5a25ad6754-config\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.551693 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e1085917-1e37-44d3-bb87-79f567b4c67c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1085917-1e37-44d3-bb87-79f567b4c67c\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.551727 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63965380-d86f-4abf-9c9c-4d5a25ad6754-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.551757 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/63965380-d86f-4abf-9c9c-4d5a25ad6754-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.551785 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jspws\" (UniqueName: \"kubernetes.io/projected/63965380-d86f-4abf-9c9c-4d5a25ad6754-kube-api-access-jspws\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.551829 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63965380-d86f-4abf-9c9c-4d5a25ad6754-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.551875 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63965380-d86f-4abf-9c9c-4d5a25ad6754-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.553427 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63965380-d86f-4abf-9c9c-4d5a25ad6754-config\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.554357 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63965380-d86f-4abf-9c9c-4d5a25ad6754-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" 
Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.554958 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63965380-d86f-4abf-9c9c-4d5a25ad6754-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.556952 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.556990 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e1085917-1e37-44d3-bb87-79f567b4c67c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1085917-1e37-44d3-bb87-79f567b4c67c\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/15e0fd3ac24c46a15af66b23aee01c06e34d33849dca7d586b1056a15a81b66a/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.558039 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63965380-d86f-4abf-9c9c-4d5a25ad6754-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.559099 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63965380-d86f-4abf-9c9c-4d5a25ad6754-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.573275 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63965380-d86f-4abf-9c9c-4d5a25ad6754-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.573862 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jspws\" (UniqueName: \"kubernetes.io/projected/63965380-d86f-4abf-9c9c-4d5a25ad6754-kube-api-access-jspws\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.603589 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e1085917-1e37-44d3-bb87-79f567b4c67c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1085917-1e37-44d3-bb87-79f567b4c67c\") pod \"ovsdbserver-nb-0\" (UID: \"63965380-d86f-4abf-9c9c-4d5a25ad6754\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:40 crc kubenswrapper[4830]: I1203 22:24:40.660370 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.179027 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.180362 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.182574 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.183096 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.183244 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.183356 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.186481 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-lwlr4" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.199461 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.333160 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/596170cd-57e9-4665-947d-ddb1549a38e0-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.333204 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596170cd-57e9-4665-947d-ddb1549a38e0-cloudkitty-lokistack-ca-bundle\") pod 
\"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.333269 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596170cd-57e9-4665-947d-ddb1549a38e0-config\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.333310 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/596170cd-57e9-4665-947d-ddb1549a38e0-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.333344 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f97rj\" (UniqueName: \"kubernetes.io/projected/596170cd-57e9-4665-947d-ddb1549a38e0-kube-api-access-f97rj\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.401842 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-548665d79b-xntf7"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.402855 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.409818 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.414171 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.414716 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.425795 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-548665d79b-xntf7"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.434351 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/596170cd-57e9-4665-947d-ddb1549a38e0-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.434393 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596170cd-57e9-4665-947d-ddb1549a38e0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.434475 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596170cd-57e9-4665-947d-ddb1549a38e0-config\") pod 
\"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.434570 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/596170cd-57e9-4665-947d-ddb1549a38e0-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.434615 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f97rj\" (UniqueName: \"kubernetes.io/projected/596170cd-57e9-4665-947d-ddb1549a38e0-kube-api-access-f97rj\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.437166 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596170cd-57e9-4665-947d-ddb1549a38e0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.438943 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596170cd-57e9-4665-947d-ddb1549a38e0-config\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 
22:24:44.448034 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/596170cd-57e9-4665-947d-ddb1549a38e0-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.458884 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/596170cd-57e9-4665-947d-ddb1549a38e0-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.483741 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f97rj\" (UniqueName: \"kubernetes.io/projected/596170cd-57e9-4665-947d-ddb1549a38e0-kube-api-access-f97rj\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gvmnk\" (UID: \"596170cd-57e9-4665-947d-ddb1549a38e0\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.501378 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.520338 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.524165 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.532366 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.532593 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.532695 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.533315 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dkz6d" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.536354 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.536384 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-config\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.536564 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbtgh\" (UniqueName: \"kubernetes.io/projected/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-kube-api-access-xbtgh\") pod 
\"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.536626 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.536718 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.536769 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.552975 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.554242 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.556044 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.556255 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.563565 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.635067 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638345 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5kvm\" (UniqueName: \"kubernetes.io/projected/de378972-d74f-44fe-a727-19bde47f0cbe-kube-api-access-n5kvm\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638392 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c212e9c4-4562-48b2-9be8-bf00f52a076a-config\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638415 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/de378972-d74f-44fe-a727-19bde47f0cbe-cloudkitty-lokistack-query-frontend-grpc\") pod 
\"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638441 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbtgh\" (UniqueName: \"kubernetes.io/projected/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-kube-api-access-xbtgh\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638484 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de378972-d74f-44fe-a727-19bde47f0cbe-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638530 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 
22:24:44.638551 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-87e4905b-fab9-430e-803b-33e832b8649f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-87e4905b-fab9-430e-803b-33e832b8649f\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638575 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638597 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s874p\" (UniqueName: \"kubernetes.io/projected/c212e9c4-4562-48b2-9be8-bf00f52a076a-kube-api-access-s874p\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638628 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/de378972-d74f-44fe-a727-19bde47f0cbe-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638674 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/de378972-d74f-44fe-a727-19bde47f0cbe-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638698 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c212e9c4-4562-48b2-9be8-bf00f52a076a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638720 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638742 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-config\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638764 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c212e9c4-4562-48b2-9be8-bf00f52a076a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638789 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c212e9c4-4562-48b2-9be8-bf00f52a076a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638806 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c212e9c4-4562-48b2-9be8-bf00f52a076a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.638826 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c212e9c4-4562-48b2-9be8-bf00f52a076a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.642445 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.642602 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-config\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 
22:24:44.671357 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.671408 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.672023 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.719230 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbtgh\" (UniqueName: \"kubernetes.io/projected/2d6f2070-f2d3-47d2-b43f-dfdaed23e03b-kube-api-access-xbtgh\") pod \"cloudkitty-lokistack-querier-548665d79b-xntf7\" (UID: \"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b\") " pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.732919 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.766942 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.771832 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/de378972-d74f-44fe-a727-19bde47f0cbe-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.771988 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de378972-d74f-44fe-a727-19bde47f0cbe-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.772037 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c212e9c4-4562-48b2-9be8-bf00f52a076a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.772122 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c212e9c4-4562-48b2-9be8-bf00f52a076a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 
22:24:44.772180 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c212e9c4-4562-48b2-9be8-bf00f52a076a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.772338 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c212e9c4-4562-48b2-9be8-bf00f52a076a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.772379 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c212e9c4-4562-48b2-9be8-bf00f52a076a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.772409 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5kvm\" (UniqueName: \"kubernetes.io/projected/de378972-d74f-44fe-a727-19bde47f0cbe-kube-api-access-n5kvm\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.772566 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c212e9c4-4562-48b2-9be8-bf00f52a076a-config\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.772618 4830 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/de378972-d74f-44fe-a727-19bde47f0cbe-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.772846 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de378972-d74f-44fe-a727-19bde47f0cbe-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.772923 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-87e4905b-fab9-430e-803b-33e832b8649f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-87e4905b-fab9-430e-803b-33e832b8649f\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.772978 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s874p\" (UniqueName: \"kubernetes.io/projected/c212e9c4-4562-48b2-9be8-bf00f52a076a-kube-api-access-s874p\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.785795 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c212e9c4-4562-48b2-9be8-bf00f52a076a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.787023 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de378972-d74f-44fe-a727-19bde47f0cbe-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.801697 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c212e9c4-4562-48b2-9be8-bf00f52a076a-config\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.802347 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c212e9c4-4562-48b2-9be8-bf00f52a076a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.804360 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de378972-d74f-44fe-a727-19bde47f0cbe-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.806801 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c212e9c4-4562-48b2-9be8-bf00f52a076a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.810141 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.820256 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c212e9c4-4562-48b2-9be8-bf00f52a076a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.825133 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5kvm\" (UniqueName: \"kubernetes.io/projected/de378972-d74f-44fe-a727-19bde47f0cbe-kube-api-access-n5kvm\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.863081 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c212e9c4-4562-48b2-9be8-bf00f52a076a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.863647 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.864102 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/de378972-d74f-44fe-a727-19bde47f0cbe-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.864541 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s874p\" (UniqueName: \"kubernetes.io/projected/c212e9c4-4562-48b2-9be8-bf00f52a076a-kube-api-access-s874p\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.864609 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-5vbg4" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.865254 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/de378972-d74f-44fe-a727-19bde47f0cbe-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-779849886d-qg9t9\" (UID: \"de378972-d74f-44fe-a727-19bde47f0cbe\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.869056 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.869412 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.869833 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.870012 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.870149 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.873943 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.873992 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-87e4905b-fab9-430e-803b-33e832b8649f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-87e4905b-fab9-430e-803b-33e832b8649f\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/44d8d5ad8c95c055672b124c29b329058d126886dadddd00d3904fb2f54df920/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.878491 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.878543 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/800e6ad6-526b-4134-b759-b9c0d884e3f5-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.878573 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " 
pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.878592 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.878638 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.878657 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/800e6ad6-526b-4134-b759-b9c0d884e3f5-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.878688 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9l8x\" (UniqueName: \"kubernetes.io/projected/800e6ad6-526b-4134-b759-b9c0d884e3f5-kube-api-access-k9l8x\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.878740 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.878760 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.887810 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.900319 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.901481 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.911434 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.919423 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t"] Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.927153 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-87e4905b-fab9-430e-803b-33e832b8649f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-87e4905b-fab9-430e-803b-33e832b8649f\") pod \"ovsdbserver-sb-0\" (UID: \"c212e9c4-4562-48b2-9be8-bf00f52a076a\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980637 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980690 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980724 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 
22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980748 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980767 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980789 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980807 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980845 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"tls-secret\" (UniqueName: \"kubernetes.io/secret/800e6ad6-526b-4134-b759-b9c0d884e3f5-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980863 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980882 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980901 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980917 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " 
pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980945 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9sgg\" (UniqueName: \"kubernetes.io/projected/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-kube-api-access-n9sgg\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980972 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.980990 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.981006 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/800e6ad6-526b-4134-b759-b9c0d884e3f5-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.981035 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.982124 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.982663 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.983624 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9l8x\" (UniqueName: \"kubernetes.io/projected/800e6ad6-526b-4134-b759-b9c0d884e3f5-kube-api-access-k9l8x\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.984389 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " 
pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.985139 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.985964 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/800e6ad6-526b-4134-b759-b9c0d884e3f5-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.986066 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/800e6ad6-526b-4134-b759-b9c0d884e3f5-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.987995 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/800e6ad6-526b-4134-b759-b9c0d884e3f5-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.990633 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/800e6ad6-526b-4134-b759-b9c0d884e3f5-tenants\") pod 
\"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:44 crc kubenswrapper[4830]: I1203 22:24:44.998782 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9l8x\" (UniqueName: \"kubernetes.io/projected/800e6ad6-526b-4134-b759-b9c0d884e3f5-kube-api-access-k9l8x\") pod \"cloudkitty-lokistack-gateway-76cc998948-gbw74\" (UID: \"800e6ad6-526b-4134-b759-b9c0d884e3f5\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.085237 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.085317 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9sgg\" (UniqueName: \"kubernetes.io/projected/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-kube-api-access-n9sgg\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.085365 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.085413 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.085469 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.085493 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.085554 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.085583 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " 
pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.085608 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.086802 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.087465 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.088043 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.088175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.088825 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.088890 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.089212 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.089646 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 
22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.106410 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9sgg\" (UniqueName: \"kubernetes.io/projected/bb34bcb7-4a40-4d5b-a5ca-55571c61b999-kube-api-access-n9sgg\") pod \"cloudkitty-lokistack-gateway-76cc998948-tbz5t\" (UID: \"bb34bcb7-4a40-4d5b-a5ca-55571c61b999\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.184585 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.207323 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.224974 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.399099 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.400271 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.405954 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.406485 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.423773 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.488826 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.489953 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.492283 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.493849 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mwgk\" (UniqueName: \"kubernetes.io/projected/09564097-60ae-4b1d-bd03-ba8b5a254167-kube-api-access-7mwgk\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.493891 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.493914 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.493939 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.493984 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09564097-60ae-4b1d-bd03-ba8b5a254167-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.494012 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.494248 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.494446 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.494482 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.505397 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.596400 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mwgk\" (UniqueName: \"kubernetes.io/projected/09564097-60ae-4b1d-bd03-ba8b5a254167-kube-api-access-7mwgk\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.596687 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.596760 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.596810 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.596900 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09564097-60ae-4b1d-bd03-ba8b5a254167-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.596977 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.597057 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.597136 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: 
\"kubernetes.io/secret/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.597343 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.599872 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.600377 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09564097-60ae-4b1d-bd03-ba8b5a254167-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.601659 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.607415 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.610576 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.617460 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.621176 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.622817 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.623883 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.624291 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.625325 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/09564097-60ae-4b1d-bd03-ba8b5a254167-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.634586 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mwgk\" (UniqueName: \"kubernetes.io/projected/09564097-60ae-4b1d-bd03-ba8b5a254167-kube-api-access-7mwgk\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.638319 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.640856 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"09564097-60ae-4b1d-bd03-ba8b5a254167\") " pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.698233 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.698291 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " 
pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.698334 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83a885d2-eea8-4a2c-83d7-a0a945597421-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.698436 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.698494 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.698559 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2jm8\" (UniqueName: \"kubernetes.io/projected/83a885d2-eea8-4a2c-83d7-a0a945597421-kube-api-access-r2jm8\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.698743 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.736632 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.800924 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mqkh\" (UniqueName: \"kubernetes.io/projected/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-kube-api-access-9mqkh\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801010 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801061 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801112 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-lokistack-compactor-http\") pod 
\"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801149 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801184 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801208 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801232 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801263 4830 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83a885d2-eea8-4a2c-83d7-a0a945597421-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801318 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801344 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801379 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.801418 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: 
I1203 22:24:45.801464 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2jm8\" (UniqueName: \"kubernetes.io/projected/83a885d2-eea8-4a2c-83d7-a0a945597421-kube-api-access-r2jm8\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.802107 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.802238 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.803263 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83a885d2-eea8-4a2c-83d7-a0a945597421-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.806276 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " 
pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.806679 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.808231 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/83a885d2-eea8-4a2c-83d7-a0a945597421-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.826359 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2jm8\" (UniqueName: \"kubernetes.io/projected/83a885d2-eea8-4a2c-83d7-a0a945597421-kube-api-access-r2jm8\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.841400 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"83a885d2-eea8-4a2c-83d7-a0a945597421\") " pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.903265 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: 
\"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.903319 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.903340 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.903389 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.903417 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.903479 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mqkh\" (UniqueName: 
\"kubernetes.io/projected/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-kube-api-access-9mqkh\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.903526 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.904392 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.904517 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.905555 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.908559 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.909974 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.921339 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mqkh\" (UniqueName: \"kubernetes.io/projected/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-kube-api-access-9mqkh\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.922255 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/5af9b96f-fb0a-482b-9000-3b76a8c6c07c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 22:24:45.931782 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"5af9b96f-fb0a-482b-9000-3b76a8c6c07c\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:45 crc kubenswrapper[4830]: I1203 
22:24:45.999924 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:24:46 crc kubenswrapper[4830]: I1203 22:24:46.120041 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:24:56 crc kubenswrapper[4830]: I1203 22:24:56.681695 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:24:57 crc kubenswrapper[4830]: I1203 22:24:56.682466 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:24:59 crc kubenswrapper[4830]: E1203 22:24:59.095472 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 03 22:24:59 crc kubenswrapper[4830]: E1203 22:24:59.096340 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 
600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jl9qq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeD
evice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(3fc13f96-b9cf-4e92-bbe6-2c3719041e59): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:24:59 crc kubenswrapper[4830]: E1203 22:24:59.097735 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" Dec 03 22:24:59 crc kubenswrapper[4830]: E1203 22:24:59.134031 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 03 22:24:59 crc kubenswrapper[4830]: E1203 22:24:59.134471 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tnftw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(1d294aa0-bf67-4fc4-ad99-fda0ddd054d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:24:59 crc 
kubenswrapper[4830]: E1203 22:24:59.135843 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" Dec 03 22:24:59 crc kubenswrapper[4830]: E1203 22:24:59.892229 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 22:24:59 crc kubenswrapper[4830]: E1203 22:24:59.892394 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p7f4l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-ldj7f_openstack(588e3fa1-4978-4c0b-83b9-258582e0b06f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:24:59 crc kubenswrapper[4830]: E1203 22:24:59.893570 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" podUID="588e3fa1-4978-4c0b-83b9-258582e0b06f" Dec 03 22:25:00 crc kubenswrapper[4830]: E1203 22:25:00.000636 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" Dec 03 22:25:00 crc kubenswrapper[4830]: E1203 22:25:00.000863 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" Dec 03 22:25:00 crc kubenswrapper[4830]: E1203 22:25:00.524405 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 03 22:25:00 crc kubenswrapper[4830]: E1203 22:25:00.524943 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n66dh56ch64hb9h6bh54fh7bh676hdh5bbh8chd4h588h694h574h5h5d6h86h657h57dh9bh6dh5c9h5f6hbh65dh545h656h559h59ch7ch5ccq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fz2rg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(deb3672e-3fb5-4549-ae27-6f7402c1e3d8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:25:00 crc kubenswrapper[4830]: E1203 22:25:00.526136 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="deb3672e-3fb5-4549-ae27-6f7402c1e3d8" Dec 03 22:25:01 crc kubenswrapper[4830]: E1203 22:25:01.009488 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="deb3672e-3fb5-4549-ae27-6f7402c1e3d8" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.332702 4830 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.332850 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j2klv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootF
ilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(85bd20b1-76d6-4238-be14-1c5891d6bbd8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.334016 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="85bd20b1-76d6-4238-be14-1c5891d6bbd8" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.338860 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.339003 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6wcxf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(21e1ac03-6466-4663-bff2-68ff2cc7801d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.340235 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="21e1ac03-6466-4663-bff2-68ff2cc7801d" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.378792 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.378962 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5sfbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-nl9bz_openstack(edd64b8e-d074-4720-924b-2ee99744a69a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.380326 4830 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" podUID="edd64b8e-d074-4720-924b-2ee99744a69a" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.510186 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.510497 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xhj57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-9lj9s_openstack(1a6ed877-b9b1-4c85-b587-f98928d02441): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.511839 4830 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" podUID="1a6ed877-b9b1-4c85-b587-f98928d02441" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.523111 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.523336 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2gq5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-vjkwl_openstack(303014fe-7aad-489f-9fd9-51d7f373325e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:25:02 crc kubenswrapper[4830]: E1203 22:25:02.524553 4830 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl" podUID="303014fe-7aad-489f-9fd9-51d7f373325e" Dec 03 22:25:02 crc kubenswrapper[4830]: I1203 22:25:02.702332 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" Dec 03 22:25:02 crc kubenswrapper[4830]: I1203 22:25:02.860576 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588e3fa1-4978-4c0b-83b9-258582e0b06f-config\") pod \"588e3fa1-4978-4c0b-83b9-258582e0b06f\" (UID: \"588e3fa1-4978-4c0b-83b9-258582e0b06f\") " Dec 03 22:25:02 crc kubenswrapper[4830]: I1203 22:25:02.860691 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7f4l\" (UniqueName: \"kubernetes.io/projected/588e3fa1-4978-4c0b-83b9-258582e0b06f-kube-api-access-p7f4l\") pod \"588e3fa1-4978-4c0b-83b9-258582e0b06f\" (UID: \"588e3fa1-4978-4c0b-83b9-258582e0b06f\") " Dec 03 22:25:02 crc kubenswrapper[4830]: I1203 22:25:02.862157 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/588e3fa1-4978-4c0b-83b9-258582e0b06f-config" (OuterVolumeSpecName: "config") pod "588e3fa1-4978-4c0b-83b9-258582e0b06f" (UID: "588e3fa1-4978-4c0b-83b9-258582e0b06f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:02 crc kubenswrapper[4830]: I1203 22:25:02.867810 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/588e3fa1-4978-4c0b-83b9-258582e0b06f-kube-api-access-p7f4l" (OuterVolumeSpecName: "kube-api-access-p7f4l") pod "588e3fa1-4978-4c0b-83b9-258582e0b06f" (UID: "588e3fa1-4978-4c0b-83b9-258582e0b06f"). InnerVolumeSpecName "kube-api-access-p7f4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:02 crc kubenswrapper[4830]: I1203 22:25:02.963495 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588e3fa1-4978-4c0b-83b9-258582e0b06f-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:02 crc kubenswrapper[4830]: I1203 22:25:02.963581 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7f4l\" (UniqueName: \"kubernetes.io/projected/588e3fa1-4978-4c0b-83b9-258582e0b06f-kube-api-access-p7f4l\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.021877 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.021916 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ldj7f" event={"ID":"588e3fa1-4978-4c0b-83b9-258582e0b06f","Type":"ContainerDied","Data":"d5815188f63097d5c5328e33672ef9db4758b6d204197a16263cae7d6491f401"} Dec 03 22:25:03 crc kubenswrapper[4830]: E1203 22:25:03.023579 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl" podUID="303014fe-7aad-489f-9fd9-51d7f373325e" Dec 03 22:25:03 crc kubenswrapper[4830]: E1203 22:25:03.027001 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="85bd20b1-76d6-4238-be14-1c5891d6bbd8" Dec 03 22:25:03 crc kubenswrapper[4830]: E1203 22:25:03.027084 4830 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" podUID="1a6ed877-b9b1-4c85-b587-f98928d02441" Dec 03 22:25:03 crc kubenswrapper[4830]: E1203 22:25:03.027160 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="21e1ac03-6466-4663-bff2-68ff2cc7801d" Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.205081 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:25:03 crc kubenswrapper[4830]: W1203 22:25:03.209189 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode38be206_c963_42f9_834d_a9263b18cbed.slice/crio-822cbefceb53664e8ab733416702583587f7c55bc6bf713917d272c9e75ad7c9 WatchSource:0}: Error finding container 822cbefceb53664e8ab733416702583587f7c55bc6bf713917d272c9e75ad7c9: Status 404 returned error can't find the container with id 822cbefceb53664e8ab733416702583587f7c55bc6bf713917d272c9e75ad7c9 Dec 03 22:25:03 crc kubenswrapper[4830]: W1203 22:25:03.211558 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd27b153_5334_4329_91de_0e6941ae9e97.slice/crio-2a80b79a9f5e62caeb7850b0e583d4f886597661cfcaa75603f571c124e87040 WatchSource:0}: Error finding container 2a80b79a9f5e62caeb7850b0e583d4f886597661cfcaa75603f571c124e87040: Status 404 returned error can't find the container with id 2a80b79a9f5e62caeb7850b0e583d4f886597661cfcaa75603f571c124e87040 Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.219065 4830 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ldj7f"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.225794 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ldj7f"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.232617 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.239143 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.356962 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="588e3fa1-4978-4c0b-83b9-258582e0b06f" path="/var/lib/kubelet/pods/588e3fa1-4978-4c0b-83b9-258582e0b06f/volumes" Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.357620 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.625632 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.640596 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.652243 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.655581 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk"] Dec 03 22:25:03 crc kubenswrapper[4830]: W1203 22:25:03.674885 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bdf507c_05be_4df8_8c33_85f15c05237c.slice/crio-17c8d5a0a909510e78d3b2ccd265cd63171c763219403596d2ced6bb7e7e1fc5 WatchSource:0}: Error finding container 17c8d5a0a909510e78d3b2ccd265cd63171c763219403596d2ced6bb7e7e1fc5: Status 404 returned error can't find the container with id 17c8d5a0a909510e78d3b2ccd265cd63171c763219403596d2ced6bb7e7e1fc5 Dec 03 22:25:03 crc kubenswrapper[4830]: W1203 22:25:03.675110 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d6f2070_f2d3_47d2_b43f_dfdaed23e03b.slice/crio-ed362c432edadcc05084d7f24de2a1c966a0bd7fbf22c18b2339b597fb4bc48e WatchSource:0}: Error finding container ed362c432edadcc05084d7f24de2a1c966a0bd7fbf22c18b2339b597fb4bc48e: Status 404 returned error can't find the container with id ed362c432edadcc05084d7f24de2a1c966a0bd7fbf22c18b2339b597fb4bc48e Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.677428 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-548665d79b-xntf7"] Dec 03 22:25:03 crc kubenswrapper[4830]: W1203 22:25:03.684683 4830 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5af9b96f_fb0a_482b_9000_3b76a8c6c07c.slice/crio-048e01203d93e57fce04999af55500e6ce91cebba746e515d15273a5fe5436d2 WatchSource:0}: Error finding container 048e01203d93e57fce04999af55500e6ce91cebba746e515d15273a5fe5436d2: Status 404 returned error can't find the container with id 048e01203d93e57fce04999af55500e6ce91cebba746e515d15273a5fe5436d2 Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.684733 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nlxm7"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.701463 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.776222 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-config\") pod \"edd64b8e-d074-4720-924b-2ee99744a69a\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.776728 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sfbv\" (UniqueName: \"kubernetes.io/projected/edd64b8e-d074-4720-924b-2ee99744a69a-kube-api-access-5sfbv\") pod \"edd64b8e-d074-4720-924b-2ee99744a69a\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.776788 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-dns-svc\") pod \"edd64b8e-d074-4720-924b-2ee99744a69a\" (UID: \"edd64b8e-d074-4720-924b-2ee99744a69a\") " Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.776828 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-config" (OuterVolumeSpecName: "config") pod "edd64b8e-d074-4720-924b-2ee99744a69a" (UID: "edd64b8e-d074-4720-924b-2ee99744a69a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.777363 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "edd64b8e-d074-4720-924b-2ee99744a69a" (UID: "edd64b8e-d074-4720-924b-2ee99744a69a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.777892 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.777918 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edd64b8e-d074-4720-924b-2ee99744a69a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.785772 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd64b8e-d074-4720-924b-2ee99744a69a-kube-api-access-5sfbv" (OuterVolumeSpecName: "kube-api-access-5sfbv") pod "edd64b8e-d074-4720-924b-2ee99744a69a" (UID: "edd64b8e-d074-4720-924b-2ee99744a69a"). InnerVolumeSpecName "kube-api-access-5sfbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.853764 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8j27r"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.879890 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sfbv\" (UniqueName: \"kubernetes.io/projected/edd64b8e-d074-4720-924b-2ee99744a69a-kube-api-access-5sfbv\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.914091 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.926621 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.932245 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t"] Dec 03 22:25:03 crc kubenswrapper[4830]: I1203 22:25:03.991920 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 22:25:04 crc kubenswrapper[4830]: W1203 22:25:04.020079 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83a885d2_eea8_4a2c_83d7_a0a945597421.slice/crio-09bc8394d067bb5fd528392b79d3eaa166fec5cc92216d0b49a7740518aac92b WatchSource:0}: Error finding container 09bc8394d067bb5fd528392b79d3eaa166fec5cc92216d0b49a7740518aac92b: Status 404 returned error can't find the container with id 09bc8394d067bb5fd528392b79d3eaa166fec5cc92216d0b49a7740518aac92b Dec 03 22:25:04 crc kubenswrapper[4830]: W1203 22:25:04.024539 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0d02b19_e65a_4c45_b658_a34c69cdf74e.slice/crio-327389d49140cc99b88969a3a7966a9fcfb769572f2ef83404d370b435d9a764 WatchSource:0}: Error finding container 327389d49140cc99b88969a3a7966a9fcfb769572f2ef83404d370b435d9a764: Status 404 returned error can't find the container with id 327389d49140cc99b88969a3a7966a9fcfb769572f2ef83404d370b435d9a764 Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.029452 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" event={"ID":"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b","Type":"ContainerStarted","Data":"ed362c432edadcc05084d7f24de2a1c966a0bd7fbf22c18b2339b597fb4bc48e"} Dec 03 22:25:04 crc kubenswrapper[4830]: E1203 22:25:04.030809 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-ingester,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51,Command:[],Args:[-target=ingester -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:wal,ReadOnly:false,MountPath:/tmp/wal,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ingester-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ingester-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca
,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7mwgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-ingester-0_openstack(09564097-60ae-4b1d-bd03-ba8b5a254167): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.031240 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" event={"ID":"596170cd-57e9-4665-947d-ddb1549a38e0","Type":"ContainerStarted","Data":"d3a545b73ee77fb27051abf8ab6913a66122e5c69a1f3ca86bc61db5542a12b4"} Dec 03 22:25:04 crc 
kubenswrapper[4830]: E1203 22:25:04.032060 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-ingester\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="09564097-60ae-4b1d-bd03-ba8b5a254167" Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.032417 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" event={"ID":"de378972-d74f-44fe-a727-19bde47f0cbe","Type":"ContainerStarted","Data":"f52c1c390dc7eb2a07c05c2b4e7d400760df3a888e104113179cd6432e7df5d1"} Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.034098 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"cd27b153-5334-4329-91de-0e6941ae9e97","Type":"ContainerStarted","Data":"2a80b79a9f5e62caeb7850b0e583d4f886597661cfcaa75603f571c124e87040"} Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.035654 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e38be206-c963-42f9-834d-a9263b18cbed","Type":"ContainerStarted","Data":"822cbefceb53664e8ab733416702583587f7c55bc6bf713917d272c9e75ad7c9"} Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.037633 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"83a885d2-eea8-4a2c-83d7-a0a945597421","Type":"ContainerStarted","Data":"09bc8394d067bb5fd528392b79d3eaa166fec5cc92216d0b49a7740518aac92b"} Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.038900 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" event={"ID":"edd64b8e-d074-4720-924b-2ee99744a69a","Type":"ContainerDied","Data":"0e4dd14b9ec130c6417bc438586f2d04b4699443204ee05dc0a295efd1716461"} Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.038980 4830 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-nl9bz" Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.044455 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nlxm7" event={"ID":"6bdf507c-05be-4df8-8c33-85f15c05237c","Type":"ContainerStarted","Data":"17c8d5a0a909510e78d3b2ccd265cd63171c763219403596d2ced6bb7e7e1fc5"} Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.045812 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"443cf1a9-f7ab-413e-bddf-08978b24fc87","Type":"ContainerStarted","Data":"343938ffaf89146bfeb86e45c007ed5938461b7f03a682f76836bd3992e8f657"} Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.046952 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" event={"ID":"bb34bcb7-4a40-4d5b-a5ca-55571c61b999","Type":"ContainerStarted","Data":"bfbe5ecfd692bb943299c67ac110cd56f385e5ee9a6869d96356e72d12bd625f"} Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.048075 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" event={"ID":"800e6ad6-526b-4134-b759-b9c0d884e3f5","Type":"ContainerStarted","Data":"5f2c71c89167709768253495919bb6e0ee16866188ddaefc259eedef4c8351c3"} Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.049368 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"5af9b96f-fb0a-482b-9000-3b76a8c6c07c","Type":"ContainerStarted","Data":"048e01203d93e57fce04999af55500e6ce91cebba746e515d15273a5fe5436d2"} Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.050828 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63965380-d86f-4abf-9c9c-4d5a25ad6754","Type":"ContainerStarted","Data":"007ad763a3109ce1131c52dd382ffe19601825c4cd763c4db61fd613df2850b0"} Dec 03 22:25:04 
crc kubenswrapper[4830]: I1203 22:25:04.097587 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nl9bz"] Dec 03 22:25:04 crc kubenswrapper[4830]: I1203 22:25:04.103035 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nl9bz"] Dec 03 22:25:05 crc kubenswrapper[4830]: I1203 22:25:05.067709 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c212e9c4-4562-48b2-9be8-bf00f52a076a","Type":"ContainerStarted","Data":"116c738bffb3c1cd728858401d261df4c046efaa579f9b1404ddb4482e89a9d7"} Dec 03 22:25:05 crc kubenswrapper[4830]: I1203 22:25:05.070341 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"09564097-60ae-4b1d-bd03-ba8b5a254167","Type":"ContainerStarted","Data":"9c8e892420a179a5220cc05fb392833e041e04687f2fcb1c2262e1e0bb915aa9"} Dec 03 22:25:05 crc kubenswrapper[4830]: E1203 22:25:05.073234 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-ingester\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51\\\"\"" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="09564097-60ae-4b1d-bd03-ba8b5a254167" Dec 03 22:25:05 crc kubenswrapper[4830]: I1203 22:25:05.074691 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8j27r" event={"ID":"b0d02b19-e65a-4c45-b658-a34c69cdf74e","Type":"ContainerStarted","Data":"327389d49140cc99b88969a3a7966a9fcfb769572f2ef83404d370b435d9a764"} Dec 03 22:25:05 crc kubenswrapper[4830]: I1203 22:25:05.347713 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd64b8e-d074-4720-924b-2ee99744a69a" path="/var/lib/kubelet/pods/edd64b8e-d074-4720-924b-2ee99744a69a/volumes" Dec 03 22:25:06 crc kubenswrapper[4830]: E1203 
22:25:06.082639 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-ingester\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:14f37195a4957e3848690d0ffe5422be55f7599b30dfe1ee0f97eb1118a10a51\\\"\"" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="09564097-60ae-4b1d-bd03-ba8b5a254167" Dec 03 22:25:10 crc kubenswrapper[4830]: I1203 22:25:10.125725 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" event={"ID":"596170cd-57e9-4665-947d-ddb1549a38e0","Type":"ContainerStarted","Data":"466dd58f09e3d421cf91c59637d4b40b8b36b63e49ecaada9bbd407ae254d84d"} Dec 03 22:25:10 crc kubenswrapper[4830]: I1203 22:25:10.127554 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:25:10 crc kubenswrapper[4830]: I1203 22:25:10.132207 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" event={"ID":"bb34bcb7-4a40-4d5b-a5ca-55571c61b999","Type":"ContainerStarted","Data":"7c834815ceba6ac3d1d9b660decab747a8d00f3b2e6061943535f730e29814d9"} Dec 03 22:25:10 crc kubenswrapper[4830]: I1203 22:25:10.132444 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:25:10 crc kubenswrapper[4830]: I1203 22:25:10.136665 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"5af9b96f-fb0a-482b-9000-3b76a8c6c07c","Type":"ContainerStarted","Data":"098683ef860d747325b265d394150fecc7727c80a98be5c57663c5b598212047"} Dec 03 22:25:10 crc kubenswrapper[4830]: I1203 22:25:10.136836 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:25:10 crc 
kubenswrapper[4830]: I1203 22:25:10.156587 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" podStartSLOduration=20.785673822 podStartE2EDuration="26.156562384s" podCreationTimestamp="2025-12-03 22:24:44 +0000 UTC" firstStartedPulling="2025-12-03 22:25:03.657891154 +0000 UTC m=+1192.654352503" lastFinishedPulling="2025-12-03 22:25:09.028779716 +0000 UTC m=+1198.025241065" observedRunningTime="2025-12-03 22:25:10.152967116 +0000 UTC m=+1199.149428475" watchObservedRunningTime="2025-12-03 22:25:10.156562384 +0000 UTC m=+1199.153023743" Dec 03 22:25:10 crc kubenswrapper[4830]: I1203 22:25:10.166862 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" Dec 03 22:25:10 crc kubenswrapper[4830]: I1203 22:25:10.193714 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=20.867587783 podStartE2EDuration="26.19368958s" podCreationTimestamp="2025-12-03 22:24:44 +0000 UTC" firstStartedPulling="2025-12-03 22:25:03.702497344 +0000 UTC m=+1192.698958693" lastFinishedPulling="2025-12-03 22:25:09.028599141 +0000 UTC m=+1198.025060490" observedRunningTime="2025-12-03 22:25:10.179754789 +0000 UTC m=+1199.176216138" watchObservedRunningTime="2025-12-03 22:25:10.19368958 +0000 UTC m=+1199.190150939" Dec 03 22:25:10 crc kubenswrapper[4830]: I1203 22:25:10.289877 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-tbz5t" podStartSLOduration=21.089064387 podStartE2EDuration="26.289846349s" podCreationTimestamp="2025-12-03 22:24:44 +0000 UTC" firstStartedPulling="2025-12-03 22:25:03.951553373 +0000 UTC m=+1192.948014732" lastFinishedPulling="2025-12-03 22:25:09.152335345 +0000 UTC m=+1198.148796694" observedRunningTime="2025-12-03 22:25:10.213098261 +0000 
UTC m=+1199.209559620" watchObservedRunningTime="2025-12-03 22:25:10.289846349 +0000 UTC m=+1199.286307698" Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.145814 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c212e9c4-4562-48b2-9be8-bf00f52a076a","Type":"ContainerStarted","Data":"e92a8c3ccda0e091ff5a5a0d5cb769722210b378d80a665e5c9e9997bbf9b5fb"} Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.151024 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63965380-d86f-4abf-9c9c-4d5a25ad6754","Type":"ContainerStarted","Data":"75a9981810ba2da55adda3be6a5b336247274ba36656d0479a762b254c9da696"} Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.152804 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"83a885d2-eea8-4a2c-83d7-a0a945597421","Type":"ContainerStarted","Data":"8aad73b00c8c80312b4fc5bf859a5d42fe25b126b198356ba55c89929527e6b6"} Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.152935 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.154199 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" event={"ID":"2d6f2070-f2d3-47d2-b43f-dfdaed23e03b","Type":"ContainerStarted","Data":"94999ef0fdf83bcf5abed97e7bfbbc842ee19b30c065205d1c5b2c373c4a42c2"} Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.154315 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.155391 4830 generic.go:334] "Generic (PLEG): container finished" podID="b0d02b19-e65a-4c45-b658-a34c69cdf74e" containerID="5f7ef000a8095c8257209a51a533533c5dbaa14e43eecae394927eef6d6d93b1" exitCode=0 Dec 
03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.155441 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8j27r" event={"ID":"b0d02b19-e65a-4c45-b658-a34c69cdf74e","Type":"ContainerDied","Data":"5f7ef000a8095c8257209a51a533533c5dbaa14e43eecae394927eef6d6d93b1"} Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.156777 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nlxm7" event={"ID":"6bdf507c-05be-4df8-8c33-85f15c05237c","Type":"ContainerStarted","Data":"63b68bdd5656e0f531d8465257890c9f59501612aa62bca6cc5c88183cf6e748"} Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.157264 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nlxm7" Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.159095 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"443cf1a9-f7ab-413e-bddf-08978b24fc87","Type":"ContainerStarted","Data":"e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0"} Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.159716 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.164005 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" event={"ID":"de378972-d74f-44fe-a727-19bde47f0cbe","Type":"ContainerStarted","Data":"0c0608b8b142d76bfb0a2931573085c1cb0f4628656ad3468c60727821951236"} Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.197411 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=22.037853271 podStartE2EDuration="27.197392025s" podCreationTimestamp="2025-12-03 22:24:44 +0000 UTC" firstStartedPulling="2025-12-03 22:25:04.023145611 +0000 UTC m=+1193.019606970" 
lastFinishedPulling="2025-12-03 22:25:09.182684375 +0000 UTC m=+1198.179145724" observedRunningTime="2025-12-03 22:25:11.17601832 +0000 UTC m=+1200.172479659" watchObservedRunningTime="2025-12-03 22:25:11.197392025 +0000 UTC m=+1200.193853374" Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.198870 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=30.281545102 podStartE2EDuration="36.198864326s" podCreationTimestamp="2025-12-03 22:24:35 +0000 UTC" firstStartedPulling="2025-12-03 22:25:03.206480411 +0000 UTC m=+1192.202941760" lastFinishedPulling="2025-12-03 22:25:09.123799635 +0000 UTC m=+1198.120260984" observedRunningTime="2025-12-03 22:25:11.193878749 +0000 UTC m=+1200.190340128" watchObservedRunningTime="2025-12-03 22:25:11.198864326 +0000 UTC m=+1200.195325675" Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.246830 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nlxm7" podStartSLOduration=26.77443092 podStartE2EDuration="32.246811947s" podCreationTimestamp="2025-12-03 22:24:39 +0000 UTC" firstStartedPulling="2025-12-03 22:25:03.700232742 +0000 UTC m=+1192.696694091" lastFinishedPulling="2025-12-03 22:25:09.172613759 +0000 UTC m=+1198.169075118" observedRunningTime="2025-12-03 22:25:11.227644663 +0000 UTC m=+1200.224106012" watchObservedRunningTime="2025-12-03 22:25:11.246811947 +0000 UTC m=+1200.243273296" Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.255879 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" podStartSLOduration=21.772092026 podStartE2EDuration="27.255862084s" podCreationTimestamp="2025-12-03 22:24:44 +0000 UTC" firstStartedPulling="2025-12-03 22:25:03.699529923 +0000 UTC m=+1192.695991272" lastFinishedPulling="2025-12-03 22:25:09.183299981 +0000 UTC m=+1198.179761330" observedRunningTime="2025-12-03 
22:25:11.251607067 +0000 UTC m=+1200.248068456" watchObservedRunningTime="2025-12-03 22:25:11.255862084 +0000 UTC m=+1200.252323423" Dec 03 22:25:11 crc kubenswrapper[4830]: I1203 22:25:11.278870 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" podStartSLOduration=21.778861401 podStartE2EDuration="27.278849323s" podCreationTimestamp="2025-12-03 22:24:44 +0000 UTC" firstStartedPulling="2025-12-03 22:25:03.65810608 +0000 UTC m=+1192.654567429" lastFinishedPulling="2025-12-03 22:25:09.158094002 +0000 UTC m=+1198.154555351" observedRunningTime="2025-12-03 22:25:11.265365254 +0000 UTC m=+1200.261826603" watchObservedRunningTime="2025-12-03 22:25:11.278849323 +0000 UTC m=+1200.275310692" Dec 03 22:25:12 crc kubenswrapper[4830]: I1203 22:25:12.173757 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e38be206-c963-42f9-834d-a9263b18cbed","Type":"ContainerStarted","Data":"3e0f9ecf29bdb81ba42134b6cbadcbf1f7b6307b02fe467611e6883fe907b3d7"} Dec 03 22:25:12 crc kubenswrapper[4830]: I1203 22:25:12.177413 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8j27r" event={"ID":"b0d02b19-e65a-4c45-b658-a34c69cdf74e","Type":"ContainerStarted","Data":"6be400681dc5378b4e196aa80b774981d290aa47dee6e5313a4066a5d76efbe2"} Dec 03 22:25:12 crc kubenswrapper[4830]: I1203 22:25:12.181886 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"deb3672e-3fb5-4549-ae27-6f7402c1e3d8","Type":"ContainerStarted","Data":"fc6bd4a6f5b6aed6cd97f65ae5bcfb1d018a8056e3bbde05e06a0e8d635e9f24"} Dec 03 22:25:12 crc kubenswrapper[4830]: I1203 22:25:12.183529 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:25:12 crc kubenswrapper[4830]: I1203 22:25:12.226936 4830 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.998951945 podStartE2EDuration="39.226917477s" podCreationTimestamp="2025-12-03 22:24:33 +0000 UTC" firstStartedPulling="2025-12-03 22:24:34.587071792 +0000 UTC m=+1163.583533141" lastFinishedPulling="2025-12-03 22:25:11.815037324 +0000 UTC m=+1200.811498673" observedRunningTime="2025-12-03 22:25:12.21717304 +0000 UTC m=+1201.213634389" watchObservedRunningTime="2025-12-03 22:25:12.226917477 +0000 UTC m=+1201.223378826" Dec 03 22:25:13 crc kubenswrapper[4830]: I1203 22:25:13.209411 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8j27r" event={"ID":"b0d02b19-e65a-4c45-b658-a34c69cdf74e","Type":"ContainerStarted","Data":"07f09e8a2814d71a9e878536ed6e196b2bb236ef55e512ac75d1a9b6a7c615b0"} Dec 03 22:25:13 crc kubenswrapper[4830]: I1203 22:25:13.210412 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:25:13 crc kubenswrapper[4830]: I1203 22:25:13.210498 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:25:13 crc kubenswrapper[4830]: I1203 22:25:13.213725 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"cd27b153-5334-4329-91de-0e6941ae9e97","Type":"ContainerStarted","Data":"447223c521548f3ec3fba5b7b7478a8c65ec5a7766bb563aa1be8ae3ccb5214a"} Dec 03 22:25:13 crc kubenswrapper[4830]: I1203 22:25:13.235425 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8j27r" podStartSLOduration=29.139293894 podStartE2EDuration="34.235393643s" podCreationTimestamp="2025-12-03 22:24:39 +0000 UTC" firstStartedPulling="2025-12-03 22:25:04.02859265 +0000 UTC m=+1193.025053989" lastFinishedPulling="2025-12-03 22:25:09.124692389 +0000 UTC m=+1198.121153738" 
observedRunningTime="2025-12-03 22:25:13.228488864 +0000 UTC m=+1202.224950223" watchObservedRunningTime="2025-12-03 22:25:13.235393643 +0000 UTC m=+1202.231855032" Dec 03 22:25:13 crc kubenswrapper[4830]: I1203 22:25:13.909049 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 22:25:15 crc kubenswrapper[4830]: I1203 22:25:15.668318 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.249402 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63965380-d86f-4abf-9c9c-4d5a25ad6754","Type":"ContainerStarted","Data":"03c8a5b8c5e54a73d701530ca6b8e4d7cd1ccef98ab2cc6894f99ba02b64dbc9"} Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.256072 4830 generic.go:334] "Generic (PLEG): container finished" podID="1a6ed877-b9b1-4c85-b587-f98928d02441" containerID="63e86327eb7b08014e3020d8173d60db0f6e1eb7f92fe3346a764f2ff1920cea" exitCode=0 Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.256296 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" event={"ID":"1a6ed877-b9b1-4c85-b587-f98928d02441","Type":"ContainerDied","Data":"63e86327eb7b08014e3020d8173d60db0f6e1eb7f92fe3346a764f2ff1920cea"} Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.258715 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"85bd20b1-76d6-4238-be14-1c5891d6bbd8","Type":"ContainerStarted","Data":"7a4bad92a80dfe5dfcf1317d1c1538e8cc4ef684cb8065cfee47595a3814f4f0"} Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.265301 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c212e9c4-4562-48b2-9be8-bf00f52a076a","Type":"ContainerStarted","Data":"2c46b990f9e816b8aaf99631aaf336875c12a0beb7f1759d0a9edc123c3743cd"} Dec 03 22:25:17 crc 
kubenswrapper[4830]: I1203 22:25:17.267672 4830 generic.go:334] "Generic (PLEG): container finished" podID="303014fe-7aad-489f-9fd9-51d7f373325e" containerID="bba2dce678af090087bff358d7c88977ae8544ecd8f4e75076a1ad6727f8e63e" exitCode=0 Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.267776 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl" event={"ID":"303014fe-7aad-489f-9fd9-51d7f373325e","Type":"ContainerDied","Data":"bba2dce678af090087bff358d7c88977ae8544ecd8f4e75076a1ad6727f8e63e"} Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.271190 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" event={"ID":"800e6ad6-526b-4134-b759-b9c0d884e3f5","Type":"ContainerStarted","Data":"38784112d61cd41a0cf8a135426e9ee28416081df6f9589308a1b4095e7a2b37"} Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.271700 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.288612 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=25.706108887 podStartE2EDuration="38.288593604s" podCreationTimestamp="2025-12-03 22:24:39 +0000 UTC" firstStartedPulling="2025-12-03 22:25:03.35491913 +0000 UTC m=+1192.351380499" lastFinishedPulling="2025-12-03 22:25:15.937403827 +0000 UTC m=+1204.933865216" observedRunningTime="2025-12-03 22:25:17.283380152 +0000 UTC m=+1206.279841511" watchObservedRunningTime="2025-12-03 22:25:17.288593604 +0000 UTC m=+1206.285054973" Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.288863 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.362418 4830 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=22.470093847 podStartE2EDuration="34.362398712s" podCreationTimestamp="2025-12-03 22:24:43 +0000 UTC" firstStartedPulling="2025-12-03 22:25:04.030497843 +0000 UTC m=+1193.026959192" lastFinishedPulling="2025-12-03 22:25:15.922802678 +0000 UTC m=+1204.919264057" observedRunningTime="2025-12-03 22:25:17.355045421 +0000 UTC m=+1206.351506780" watchObservedRunningTime="2025-12-03 22:25:17.362398712 +0000 UTC m=+1206.358860061"
Dec 03 22:25:17 crc kubenswrapper[4830]: I1203 22:25:17.413266 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-gbw74" podStartSLOduration=21.153809598 podStartE2EDuration="33.413239952s" podCreationTimestamp="2025-12-03 22:24:44 +0000 UTC" firstStartedPulling="2025-12-03 22:25:03.657547615 +0000 UTC m=+1192.654008964" lastFinishedPulling="2025-12-03 22:25:15.916977959 +0000 UTC m=+1204.913439318" observedRunningTime="2025-12-03 22:25:17.40109064 +0000 UTC m=+1206.397551999" watchObservedRunningTime="2025-12-03 22:25:17.413239952 +0000 UTC m=+1206.409701311"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.185190 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.274584 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.280605 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.329653 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.589545 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lj9s"]
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.625722 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tk6tj"]
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.628081 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.631740 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.641430 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tk6tj"]
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.805941 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.806050 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-config\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.806109 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.806137 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sc7d\" (UniqueName: \"kubernetes.io/projected/ca56f30d-5e09-45f4-be07-c1e536522acc-kube-api-access-4sc7d\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.841088 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-5r2qg"]
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.842161 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.844860 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.863482 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5r2qg"]
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.910644 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.910763 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-config\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.910847 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.910870 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sc7d\" (UniqueName: \"kubernetes.io/projected/ca56f30d-5e09-45f4-be07-c1e536522acc-kube-api-access-4sc7d\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.910943 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.911739 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-config\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.911788 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.911831 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.948810 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sc7d\" (UniqueName: \"kubernetes.io/projected/ca56f30d-5e09-45f4-be07-c1e536522acc-kube-api-access-4sc7d\") pod \"dnsmasq-dns-7f896c8c65-tk6tj\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:18 crc kubenswrapper[4830]: I1203 22:25:18.953779 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.011997 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3af15ba1-ae94-416f-afe7-534d88ee8a64-ovn-rundir\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.012243 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llcqm\" (UniqueName: \"kubernetes.io/projected/3af15ba1-ae94-416f-afe7-534d88ee8a64-kube-api-access-llcqm\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.012293 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af15ba1-ae94-416f-afe7-534d88ee8a64-config\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.012345 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3af15ba1-ae94-416f-afe7-534d88ee8a64-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.012404 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af15ba1-ae94-416f-afe7-534d88ee8a64-combined-ca-bundle\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.012427 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3af15ba1-ae94-416f-afe7-534d88ee8a64-ovs-rundir\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.114268 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af15ba1-ae94-416f-afe7-534d88ee8a64-combined-ca-bundle\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.114310 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3af15ba1-ae94-416f-afe7-534d88ee8a64-ovs-rundir\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.114378 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3af15ba1-ae94-416f-afe7-534d88ee8a64-ovn-rundir\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.114398 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llcqm\" (UniqueName: \"kubernetes.io/projected/3af15ba1-ae94-416f-afe7-534d88ee8a64-kube-api-access-llcqm\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.114427 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af15ba1-ae94-416f-afe7-534d88ee8a64-config\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.114464 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3af15ba1-ae94-416f-afe7-534d88ee8a64-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.115038 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3af15ba1-ae94-416f-afe7-534d88ee8a64-ovn-rundir\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.115092 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3af15ba1-ae94-416f-afe7-534d88ee8a64-ovs-rundir\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.115824 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af15ba1-ae94-416f-afe7-534d88ee8a64-config\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.121361 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af15ba1-ae94-416f-afe7-534d88ee8a64-combined-ca-bundle\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.128773 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3af15ba1-ae94-416f-afe7-534d88ee8a64-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.135110 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llcqm\" (UniqueName: \"kubernetes.io/projected/3af15ba1-ae94-416f-afe7-534d88ee8a64-kube-api-access-llcqm\") pod \"ovn-controller-metrics-5r2qg\" (UID: \"3af15ba1-ae94-416f-afe7-534d88ee8a64\") " pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.155533 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5r2qg"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.234373 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vjkwl"]
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.271148 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hk62l"]
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.277045 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.286296 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.320418 4830 generic.go:334] "Generic (PLEG): container finished" podID="e38be206-c963-42f9-834d-a9263b18cbed" containerID="3e0f9ecf29bdb81ba42134b6cbadcbf1f7b6307b02fe467611e6883fe907b3d7" exitCode=0
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.320691 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e38be206-c963-42f9-834d-a9263b18cbed","Type":"ContainerDied","Data":"3e0f9ecf29bdb81ba42134b6cbadcbf1f7b6307b02fe467611e6883fe907b3d7"}
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.327120 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hk62l"]
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.421095 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-config\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.421157 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.421180 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.421211 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj7v8\" (UniqueName: \"kubernetes.io/projected/8958b2b0-be2d-4030-90cf-5e4335a17a50-kube-api-access-zj7v8\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.421566 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.525022 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-config\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.525400 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.525421 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.525449 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj7v8\" (UniqueName: \"kubernetes.io/projected/8958b2b0-be2d-4030-90cf-5e4335a17a50-kube-api-access-zj7v8\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.525467 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.526307 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.527101 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-config\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.527229 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.528598 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.548884 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj7v8\" (UniqueName: \"kubernetes.io/projected/8958b2b0-be2d-4030-90cf-5e4335a17a50-kube-api-access-zj7v8\") pod \"dnsmasq-dns-86db49b7ff-hk62l\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.618918 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.661937 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Dec 03 22:25:19 crc kubenswrapper[4830]: I1203 22:25:19.734073 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.175054 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5r2qg"]
Dec 03 22:25:20 crc kubenswrapper[4830]: W1203 22:25:20.175864 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3af15ba1_ae94_416f_afe7_534d88ee8a64.slice/crio-712422c6b4ea17aa873c6daaf8420dc28f37b6372e5be958ce5c6952c17a5f1d WatchSource:0}: Error finding container 712422c6b4ea17aa873c6daaf8420dc28f37b6372e5be958ce5c6952c17a5f1d: Status 404 returned error can't find the container with id 712422c6b4ea17aa873c6daaf8420dc28f37b6372e5be958ce5c6952c17a5f1d
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.302089 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tk6tj"]
Dec 03 22:25:20 crc kubenswrapper[4830]: W1203 22:25:20.308604 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca56f30d_5e09_45f4_be07_c1e536522acc.slice/crio-8c706f286c54b98c0e9a83b32828f9dc70b1632b3fa181180255c3f0a641367f WatchSource:0}: Error finding container 8c706f286c54b98c0e9a83b32828f9dc70b1632b3fa181180255c3f0a641367f: Status 404 returned error can't find the container with id 8c706f286c54b98c0e9a83b32828f9dc70b1632b3fa181180255c3f0a641367f
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.313363 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hk62l"]
Dec 03 22:25:20 crc kubenswrapper[4830]: W1203 22:25:20.323641 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8958b2b0_be2d_4030_90cf_5e4335a17a50.slice/crio-4cf7234a3abbec2013eb904ad3a315928617c0fbd64584be533bcf550e997bda WatchSource:0}: Error finding container 4cf7234a3abbec2013eb904ad3a315928617c0fbd64584be533bcf550e997bda: Status 404 returned error can't find the container with id 4cf7234a3abbec2013eb904ad3a315928617c0fbd64584be533bcf550e997bda
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.335949 4830 generic.go:334] "Generic (PLEG): container finished" podID="cd27b153-5334-4329-91de-0e6941ae9e97" containerID="447223c521548f3ec3fba5b7b7478a8c65ec5a7766bb563aa1be8ae3ccb5214a" exitCode=0
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.335987 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"cd27b153-5334-4329-91de-0e6941ae9e97","Type":"ContainerDied","Data":"447223c521548f3ec3fba5b7b7478a8c65ec5a7766bb563aa1be8ae3ccb5214a"}
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.338567 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5r2qg" event={"ID":"3af15ba1-ae94-416f-afe7-534d88ee8a64","Type":"ContainerStarted","Data":"712422c6b4ea17aa873c6daaf8420dc28f37b6372e5be958ce5c6952c17a5f1d"}
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.343267 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj" event={"ID":"ca56f30d-5e09-45f4-be07-c1e536522acc","Type":"ContainerStarted","Data":"8c706f286c54b98c0e9a83b32828f9dc70b1632b3fa181180255c3f0a641367f"}
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.344210 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.405742 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.556582 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.558860 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.561637 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.562544 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.562833 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.563078 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-sqgwp"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.563339 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.649715 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/516cc148-c477-46f0-bc3e-475ad6003486-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.649980 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/516cc148-c477-46f0-bc3e-475ad6003486-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.650002 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/516cc148-c477-46f0-bc3e-475ad6003486-config\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.650068 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/516cc148-c477-46f0-bc3e-475ad6003486-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.650109 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/516cc148-c477-46f0-bc3e-475ad6003486-scripts\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.650137 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/516cc148-c477-46f0-bc3e-475ad6003486-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.650162 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx4rs\" (UniqueName: \"kubernetes.io/projected/516cc148-c477-46f0-bc3e-475ad6003486-kube-api-access-qx4rs\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.751694 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/516cc148-c477-46f0-bc3e-475ad6003486-scripts\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.751776 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/516cc148-c477-46f0-bc3e-475ad6003486-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.751823 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx4rs\" (UniqueName: \"kubernetes.io/projected/516cc148-c477-46f0-bc3e-475ad6003486-kube-api-access-qx4rs\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.751861 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/516cc148-c477-46f0-bc3e-475ad6003486-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.751903 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/516cc148-c477-46f0-bc3e-475ad6003486-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.751930 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/516cc148-c477-46f0-bc3e-475ad6003486-config\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.752015 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/516cc148-c477-46f0-bc3e-475ad6003486-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.752559 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/516cc148-c477-46f0-bc3e-475ad6003486-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.752669 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/516cc148-c477-46f0-bc3e-475ad6003486-scripts\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.753083 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/516cc148-c477-46f0-bc3e-475ad6003486-config\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.756972 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/516cc148-c477-46f0-bc3e-475ad6003486-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.757981 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/516cc148-c477-46f0-bc3e-475ad6003486-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.758064 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/516cc148-c477-46f0-bc3e-475ad6003486-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.769865 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx4rs\" (UniqueName: \"kubernetes.io/projected/516cc148-c477-46f0-bc3e-475ad6003486-kube-api-access-qx4rs\") pod \"ovn-northd-0\" (UID: \"516cc148-c477-46f0-bc3e-475ad6003486\") " pod="openstack/ovn-northd-0"
Dec 03 22:25:20 crc kubenswrapper[4830]: I1203 22:25:20.894559 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 03 22:25:21 crc kubenswrapper[4830]: I1203 22:25:21.352803 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl" event={"ID":"303014fe-7aad-489f-9fd9-51d7f373325e","Type":"ContainerStarted","Data":"9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b"}
Dec 03 22:25:21 crc kubenswrapper[4830]: I1203 22:25:21.354461 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l" event={"ID":"8958b2b0-be2d-4030-90cf-5e4335a17a50","Type":"ContainerStarted","Data":"4cf7234a3abbec2013eb904ad3a315928617c0fbd64584be533bcf550e997bda"}
Dec 03 22:25:21 crc kubenswrapper[4830]: I1203 22:25:21.356411 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"21e1ac03-6466-4663-bff2-68ff2cc7801d","Type":"ContainerStarted","Data":"b0bf0c742739864479afa2926be2bfe9307e5ee501f769cf9c8ba46514691497"}
Dec 03 22:25:21 crc kubenswrapper[4830]: I1203 22:25:21.357876 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" event={"ID":"1a6ed877-b9b1-4c85-b587-f98928d02441","Type":"ContainerStarted","Data":"e3620866a99ad9e6962c87781cfbd5636d1ad5e8fbbdf1a3afaf7a1773379a60"}
Dec 03 22:25:21 crc kubenswrapper[4830]: I1203 22:25:21.359076 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1","Type":"ContainerStarted","Data":"fea7ef4434389728b9f81d8aa6ccd5a9ac43c0d7acae08c6c3292151a1167cff"}
Dec 03 22:25:21 crc kubenswrapper[4830]: I1203 22:25:21.374843 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fc13f96-b9cf-4e92-bbe6-2c3719041e59","Type":"ContainerStarted","Data":"b770f5ff20096e432c2f76473a1a5de1ea08c80b6cf3d629c1285e3a33deff49"}
Dec 03 22:25:21 crc kubenswrapper[4830]: W1203 22:25:21.486972 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod516cc148_c477_46f0_bc3e_475ad6003486.slice/crio-e9c305f6edee34c1f2157e8ef51b01e983b36ae6399baffb8b2ca45a5969ed51 WatchSource:0}: Error finding container e9c305f6edee34c1f2157e8ef51b01e983b36ae6399baffb8b2ca45a5969ed51: Status 404 returned error can't find the container with id e9c305f6edee34c1f2157e8ef51b01e983b36ae6399baffb8b2ca45a5969ed51
Dec 03 22:25:21 crc kubenswrapper[4830]: I1203 22:25:21.492925 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.399917 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"09564097-60ae-4b1d-bd03-ba8b5a254167","Type":"ContainerStarted","Data":"afb2944d3a2849c2e839784ea6c7dc90465dd9ac8953b8ca463b28acfcb06969"}
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.400341 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0"
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.403586 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5r2qg" event={"ID":"3af15ba1-ae94-416f-afe7-534d88ee8a64","Type":"ContainerStarted","Data":"3f3c33f8156b1d9b1cdd9c407e777b911cbaf502bc61f735d9807203fbbc1995"}
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.406430 4830 generic.go:334] "Generic (PLEG): container finished" podID="8958b2b0-be2d-4030-90cf-5e4335a17a50" containerID="6da1c4b0907b956e1cfd4d00247a7e97acb548d1bde971552bb69313b34c0777" exitCode=0
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.406575 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l" event={"ID":"8958b2b0-be2d-4030-90cf-5e4335a17a50","Type":"ContainerDied","Data":"6da1c4b0907b956e1cfd4d00247a7e97acb548d1bde971552bb69313b34c0777"}
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.407737 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"516cc148-c477-46f0-bc3e-475ad6003486","Type":"ContainerStarted","Data":"e9c305f6edee34c1f2157e8ef51b01e983b36ae6399baffb8b2ca45a5969ed51"}
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.410092 4830 generic.go:334] "Generic (PLEG): container finished" podID="ca56f30d-5e09-45f4-be07-c1e536522acc" containerID="66f4f7ca27f942212327ebed1d62d17fba35017993200cb69edad10deedbd22e" exitCode=0
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.410192 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" podUID="1a6ed877-b9b1-4c85-b587-f98928d02441" containerName="dnsmasq-dns" containerID="cri-o://e3620866a99ad9e6962c87781cfbd5636d1ad5e8fbbdf1a3afaf7a1773379a60" gracePeriod=10
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.410995 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj" event={"ID":"ca56f30d-5e09-45f4-be07-c1e536522acc","Type":"ContainerDied","Data":"66f4f7ca27f942212327ebed1d62d17fba35017993200cb69edad10deedbd22e"}
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.411079 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s"
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.411239 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl" podUID="303014fe-7aad-489f-9fd9-51d7f373325e" containerName="dnsmasq-dns" containerID="cri-o://9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b" gracePeriod=10
Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.423853 4830
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=-9223371998.430939 podStartE2EDuration="38.423837992s" podCreationTimestamp="2025-12-03 22:24:44 +0000 UTC" firstStartedPulling="2025-12-03 22:25:04.030641946 +0000 UTC m=+1193.027103315" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:22.415463663 +0000 UTC m=+1211.411925012" watchObservedRunningTime="2025-12-03 22:25:22.423837992 +0000 UTC m=+1211.420299341" Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.474129 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl" podStartSLOduration=7.346896768 podStartE2EDuration="53.474113937s" podCreationTimestamp="2025-12-03 22:24:29 +0000 UTC" firstStartedPulling="2025-12-03 22:24:29.884009426 +0000 UTC m=+1158.880470825" lastFinishedPulling="2025-12-03 22:25:16.011226625 +0000 UTC m=+1205.007687994" observedRunningTime="2025-12-03 22:25:22.464826423 +0000 UTC m=+1211.461287772" watchObservedRunningTime="2025-12-03 22:25:22.474113937 +0000 UTC m=+1211.470575286" Dec 03 22:25:22 crc kubenswrapper[4830]: I1203 22:25:22.528797 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-5r2qg" podStartSLOduration=4.527233719 podStartE2EDuration="4.527233719s" podCreationTimestamp="2025-12-03 22:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:22.524753622 +0000 UTC m=+1211.521214971" watchObservedRunningTime="2025-12-03 22:25:22.527233719 +0000 UTC m=+1211.523695068" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.252806 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.298959 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" podStartSLOduration=8.858659213 podStartE2EDuration="55.298941042s" podCreationTimestamp="2025-12-03 22:24:28 +0000 UTC" firstStartedPulling="2025-12-03 22:24:29.572706735 +0000 UTC m=+1158.569168084" lastFinishedPulling="2025-12-03 22:25:16.012988564 +0000 UTC m=+1205.009449913" observedRunningTime="2025-12-03 22:25:22.598798647 +0000 UTC m=+1211.595259996" watchObservedRunningTime="2025-12-03 22:25:23.298941042 +0000 UTC m=+1212.295402391" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.424142 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-config\") pod \"303014fe-7aad-489f-9fd9-51d7f373325e\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.424217 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-dns-svc\") pod \"303014fe-7aad-489f-9fd9-51d7f373325e\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.424289 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gq5w\" (UniqueName: \"kubernetes.io/projected/303014fe-7aad-489f-9fd9-51d7f373325e-kube-api-access-2gq5w\") pod \"303014fe-7aad-489f-9fd9-51d7f373325e\" (UID: \"303014fe-7aad-489f-9fd9-51d7f373325e\") " Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.432215 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303014fe-7aad-489f-9fd9-51d7f373325e-kube-api-access-2gq5w" (OuterVolumeSpecName: 
"kube-api-access-2gq5w") pod "303014fe-7aad-489f-9fd9-51d7f373325e" (UID: "303014fe-7aad-489f-9fd9-51d7f373325e"). InnerVolumeSpecName "kube-api-access-2gq5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.441693 4830 generic.go:334] "Generic (PLEG): container finished" podID="303014fe-7aad-489f-9fd9-51d7f373325e" containerID="9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b" exitCode=0 Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.441777 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl" event={"ID":"303014fe-7aad-489f-9fd9-51d7f373325e","Type":"ContainerDied","Data":"9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b"} Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.441807 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl" event={"ID":"303014fe-7aad-489f-9fd9-51d7f373325e","Type":"ContainerDied","Data":"b00fbeed1ab06dfeae0b8e6a69ba661cae89803401f47a9ec428d70d45170753"} Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.441825 4830 scope.go:117] "RemoveContainer" containerID="9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.441963 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vjkwl" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.481864 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l" event={"ID":"8958b2b0-be2d-4030-90cf-5e4335a17a50","Type":"ContainerStarted","Data":"a4022806dfc4acf5aec0d01837fa9e5e5b92ff824173d5c55f40775e02437b5f"} Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.482139 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.484203 4830 generic.go:334] "Generic (PLEG): container finished" podID="1a6ed877-b9b1-4c85-b587-f98928d02441" containerID="e3620866a99ad9e6962c87781cfbd5636d1ad5e8fbbdf1a3afaf7a1773379a60" exitCode=0 Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.484248 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" event={"ID":"1a6ed877-b9b1-4c85-b587-f98928d02441","Type":"ContainerDied","Data":"e3620866a99ad9e6962c87781cfbd5636d1ad5e8fbbdf1a3afaf7a1773379a60"} Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.489114 4830 generic.go:334] "Generic (PLEG): container finished" podID="85bd20b1-76d6-4238-be14-1c5891d6bbd8" containerID="7a4bad92a80dfe5dfcf1317d1c1538e8cc4ef684cb8065cfee47595a3814f4f0" exitCode=0 Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.489167 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"85bd20b1-76d6-4238-be14-1c5891d6bbd8","Type":"ContainerDied","Data":"7a4bad92a80dfe5dfcf1317d1c1538e8cc4ef684cb8065cfee47595a3814f4f0"} Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.498102 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-config" (OuterVolumeSpecName: "config") pod "303014fe-7aad-489f-9fd9-51d7f373325e" (UID: 
"303014fe-7aad-489f-9fd9-51d7f373325e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.508625 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj" event={"ID":"ca56f30d-5e09-45f4-be07-c1e536522acc","Type":"ContainerStarted","Data":"250b06d665fa887b2a65693338db98a13d2b611250cf63931318d2a2ad8131ce"} Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.508912 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.514794 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l" podStartSLOduration=4.514770493 podStartE2EDuration="4.514770493s" podCreationTimestamp="2025-12-03 22:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:23.511592187 +0000 UTC m=+1212.508053556" watchObservedRunningTime="2025-12-03 22:25:23.514770493 +0000 UTC m=+1212.511231842" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.526778 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gq5w\" (UniqueName: \"kubernetes.io/projected/303014fe-7aad-489f-9fd9-51d7f373325e-kube-api-access-2gq5w\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.526824 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.536653 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod 
"303014fe-7aad-489f-9fd9-51d7f373325e" (UID: "303014fe-7aad-489f-9fd9-51d7f373325e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.543565 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj" podStartSLOduration=5.54354815 podStartE2EDuration="5.54354815s" podCreationTimestamp="2025-12-03 22:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:23.539993293 +0000 UTC m=+1212.536454662" watchObservedRunningTime="2025-12-03 22:25:23.54354815 +0000 UTC m=+1212.540009499" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.630799 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303014fe-7aad-489f-9fd9-51d7f373325e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.773466 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vjkwl"] Dec 03 22:25:23 crc kubenswrapper[4830]: I1203 22:25:23.783763 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vjkwl"] Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.250886 4830 scope.go:117] "RemoveContainer" containerID="bba2dce678af090087bff358d7c88977ae8544ecd8f4e75076a1ad6727f8e63e" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.254806 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.356353 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303014fe-7aad-489f-9fd9-51d7f373325e" path="/var/lib/kubelet/pods/303014fe-7aad-489f-9fd9-51d7f373325e/volumes" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.367345 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-dns-svc\") pod \"1a6ed877-b9b1-4c85-b587-f98928d02441\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.367538 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-config\") pod \"1a6ed877-b9b1-4c85-b587-f98928d02441\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.367567 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhj57\" (UniqueName: \"kubernetes.io/projected/1a6ed877-b9b1-4c85-b587-f98928d02441-kube-api-access-xhj57\") pod \"1a6ed877-b9b1-4c85-b587-f98928d02441\" (UID: \"1a6ed877-b9b1-4c85-b587-f98928d02441\") " Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.374618 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a6ed877-b9b1-4c85-b587-f98928d02441-kube-api-access-xhj57" (OuterVolumeSpecName: "kube-api-access-xhj57") pod "1a6ed877-b9b1-4c85-b587-f98928d02441" (UID: "1a6ed877-b9b1-4c85-b587-f98928d02441"). InnerVolumeSpecName "kube-api-access-xhj57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.398767 4830 scope.go:117] "RemoveContainer" containerID="9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b" Dec 03 22:25:25 crc kubenswrapper[4830]: E1203 22:25:25.399183 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b\": container with ID starting with 9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b not found: ID does not exist" containerID="9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.399259 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b"} err="failed to get container status \"9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b\": rpc error: code = NotFound desc = could not find container \"9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b\": container with ID starting with 9bd553b58306ce26ac09df9e4b6e0cc62d274e5942a2b776eed163df48be5c2b not found: ID does not exist" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.399280 4830 scope.go:117] "RemoveContainer" containerID="bba2dce678af090087bff358d7c88977ae8544ecd8f4e75076a1ad6727f8e63e" Dec 03 22:25:25 crc kubenswrapper[4830]: E1203 22:25:25.399612 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba2dce678af090087bff358d7c88977ae8544ecd8f4e75076a1ad6727f8e63e\": container with ID starting with bba2dce678af090087bff358d7c88977ae8544ecd8f4e75076a1ad6727f8e63e not found: ID does not exist" containerID="bba2dce678af090087bff358d7c88977ae8544ecd8f4e75076a1ad6727f8e63e" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.399677 
4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba2dce678af090087bff358d7c88977ae8544ecd8f4e75076a1ad6727f8e63e"} err="failed to get container status \"bba2dce678af090087bff358d7c88977ae8544ecd8f4e75076a1ad6727f8e63e\": rpc error: code = NotFound desc = could not find container \"bba2dce678af090087bff358d7c88977ae8544ecd8f4e75076a1ad6727f8e63e\": container with ID starting with bba2dce678af090087bff358d7c88977ae8544ecd8f4e75076a1ad6727f8e63e not found: ID does not exist" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.450974 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a6ed877-b9b1-4c85-b587-f98928d02441" (UID: "1a6ed877-b9b1-4c85-b587-f98928d02441"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.458088 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-config" (OuterVolumeSpecName: "config") pod "1a6ed877-b9b1-4c85-b587-f98928d02441" (UID: "1a6ed877-b9b1-4c85-b587-f98928d02441"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.469909 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.469949 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhj57\" (UniqueName: \"kubernetes.io/projected/1a6ed877-b9b1-4c85-b587-f98928d02441-kube-api-access-xhj57\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.469962 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a6ed877-b9b1-4c85-b587-f98928d02441-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.526876 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" event={"ID":"1a6ed877-b9b1-4c85-b587-f98928d02441","Type":"ContainerDied","Data":"dd8d6f31d5b69c7906e07db851c527783bfcd68d330cff2ff49e2d8a02fcbec0"} Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.526917 4830 scope.go:117] "RemoveContainer" containerID="e3620866a99ad9e6962c87781cfbd5636d1ad5e8fbbdf1a3afaf7a1773379a60" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.527019 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9lj9s" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.557608 4830 scope.go:117] "RemoveContainer" containerID="63e86327eb7b08014e3020d8173d60db0f6e1eb7f92fe3346a764f2ff1920cea" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.566024 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lj9s"] Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.589676 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lj9s"] Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.809950 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tk6tj"] Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.810125 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj" podUID="ca56f30d-5e09-45f4-be07-c1e536522acc" containerName="dnsmasq-dns" containerID="cri-o://250b06d665fa887b2a65693338db98a13d2b611250cf63931318d2a2ad8131ce" gracePeriod=10 Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.873567 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-xzbv8"] Dec 03 22:25:25 crc kubenswrapper[4830]: E1203 22:25:25.873941 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303014fe-7aad-489f-9fd9-51d7f373325e" containerName="init" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.873952 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="303014fe-7aad-489f-9fd9-51d7f373325e" containerName="init" Dec 03 22:25:25 crc kubenswrapper[4830]: E1203 22:25:25.873965 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6ed877-b9b1-4c85-b587-f98928d02441" containerName="init" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.873971 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6ed877-b9b1-4c85-b587-f98928d02441" 
containerName="init" Dec 03 22:25:25 crc kubenswrapper[4830]: E1203 22:25:25.873996 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6ed877-b9b1-4c85-b587-f98928d02441" containerName="dnsmasq-dns" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.874002 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6ed877-b9b1-4c85-b587-f98928d02441" containerName="dnsmasq-dns" Dec 03 22:25:25 crc kubenswrapper[4830]: E1203 22:25:25.874014 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303014fe-7aad-489f-9fd9-51d7f373325e" containerName="dnsmasq-dns" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.874020 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="303014fe-7aad-489f-9fd9-51d7f373325e" containerName="dnsmasq-dns" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.874169 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a6ed877-b9b1-4c85-b587-f98928d02441" containerName="dnsmasq-dns" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.874189 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="303014fe-7aad-489f-9fd9-51d7f373325e" containerName="dnsmasq-dns" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.875142 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.885542 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xzbv8"] Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.980141 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.980482 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-config\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.980569 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlmrw\" (UniqueName: \"kubernetes.io/projected/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-kube-api-access-wlmrw\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.980598 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:25 crc kubenswrapper[4830]: I1203 22:25:25.980705 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-dns-svc\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.082832 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlmrw\" (UniqueName: \"kubernetes.io/projected/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-kube-api-access-wlmrw\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.082882 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.082950 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-dns-svc\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.083003 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.083030 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-config\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.083941 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-config\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.085000 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.085714 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-dns-svc\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.086551 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.090712 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.128759 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlmrw\" (UniqueName: \"kubernetes.io/projected/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-kube-api-access-wlmrw\") pod \"dnsmasq-dns-698758b865-xzbv8\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.136988 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.232383 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.546499 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"516cc148-c477-46f0-bc3e-475ad6003486","Type":"ContainerStarted","Data":"856520ec84510bf05367130f567779e205d19b48df7ea8ec842fa0adfdb36d07"} Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.546830 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.546864 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"516cc148-c477-46f0-bc3e-475ad6003486","Type":"ContainerStarted","Data":"b18fdee1613ffdcf730cdc2f687c8000470bfd351b41555a6ca4d2cf85d8b9d6"} Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.550656 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"85bd20b1-76d6-4238-be14-1c5891d6bbd8","Type":"ContainerStarted","Data":"350b4522ec23042c58d55b1bcaaf999542277a8bfdc812bcd2f8480fee71f48c"} Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.552487 4830 generic.go:334] "Generic (PLEG): container finished" podID="ca56f30d-5e09-45f4-be07-c1e536522acc" 
containerID="250b06d665fa887b2a65693338db98a13d2b611250cf63931318d2a2ad8131ce" exitCode=0 Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.552555 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj" event={"ID":"ca56f30d-5e09-45f4-be07-c1e536522acc","Type":"ContainerDied","Data":"250b06d665fa887b2a65693338db98a13d2b611250cf63931318d2a2ad8131ce"} Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.554422 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"cd27b153-5334-4329-91de-0e6941ae9e97","Type":"ContainerStarted","Data":"e057beab3ee024870983c8897df0f1f553854040a667b7d961f36ee95c835aa5"} Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.555821 4830 generic.go:334] "Generic (PLEG): container finished" podID="21e1ac03-6466-4663-bff2-68ff2cc7801d" containerID="b0bf0c742739864479afa2926be2bfe9307e5ee501f769cf9c8ba46514691497" exitCode=0 Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.555854 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"21e1ac03-6466-4663-bff2-68ff2cc7801d","Type":"ContainerDied","Data":"b0bf0c742739864479afa2926be2bfe9307e5ee501f769cf9c8ba46514691497"} Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.569788 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.811299017 podStartE2EDuration="6.569766749s" podCreationTimestamp="2025-12-03 22:25:20 +0000 UTC" firstStartedPulling="2025-12-03 22:25:21.492796954 +0000 UTC m=+1210.489258293" lastFinishedPulling="2025-12-03 22:25:25.251264676 +0000 UTC m=+1214.247726025" observedRunningTime="2025-12-03 22:25:26.569559513 +0000 UTC m=+1215.566020862" watchObservedRunningTime="2025-12-03 22:25:26.569766749 +0000 UTC m=+1215.566228198" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.617227 4830 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=15.265058035 podStartE2EDuration="56.614672537s" podCreationTimestamp="2025-12-03 22:24:30 +0000 UTC" firstStartedPulling="2025-12-03 22:24:34.658888689 +0000 UTC m=+1163.655350038" lastFinishedPulling="2025-12-03 22:25:16.008503191 +0000 UTC m=+1205.004964540" observedRunningTime="2025-12-03 22:25:26.590454645 +0000 UTC m=+1215.586916004" watchObservedRunningTime="2025-12-03 22:25:26.614672537 +0000 UTC m=+1215.611133886" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.681142 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.681190 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.765875 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xzbv8"] Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.946562 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.957363 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.959813 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.959970 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2w45c" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.960030 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.964478 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.978780 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 22:25:26 crc kubenswrapper[4830]: I1203 22:25:26.990007 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.103592 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sc7d\" (UniqueName: \"kubernetes.io/projected/ca56f30d-5e09-45f4-be07-c1e536522acc-kube-api-access-4sc7d\") pod \"ca56f30d-5e09-45f4-be07-c1e536522acc\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.103937 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-ovsdbserver-sb\") pod \"ca56f30d-5e09-45f4-be07-c1e536522acc\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.103988 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-dns-svc\") pod \"ca56f30d-5e09-45f4-be07-c1e536522acc\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.104050 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-config\") pod \"ca56f30d-5e09-45f4-be07-c1e536522acc\" (UID: \"ca56f30d-5e09-45f4-be07-c1e536522acc\") " Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.104272 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d458b2ec-dd5e-4599-94f6-b1b66b058f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d458b2ec-dd5e-4599-94f6-b1b66b058f03\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.104356 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.104378 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-lock\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.104449 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42sxm\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-kube-api-access-42sxm\") pod \"swift-storage-0\" (UID: 
\"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.104489 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-cache\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.119139 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca56f30d-5e09-45f4-be07-c1e536522acc-kube-api-access-4sc7d" (OuterVolumeSpecName: "kube-api-access-4sc7d") pod "ca56f30d-5e09-45f4-be07-c1e536522acc" (UID: "ca56f30d-5e09-45f4-be07-c1e536522acc"). InnerVolumeSpecName "kube-api-access-4sc7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.162372 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-config" (OuterVolumeSpecName: "config") pod "ca56f30d-5e09-45f4-be07-c1e536522acc" (UID: "ca56f30d-5e09-45f4-be07-c1e536522acc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.163182 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ca56f30d-5e09-45f4-be07-c1e536522acc" (UID: "ca56f30d-5e09-45f4-be07-c1e536522acc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.167590 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca56f30d-5e09-45f4-be07-c1e536522acc" (UID: "ca56f30d-5e09-45f4-be07-c1e536522acc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.205878 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-cache\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.205937 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d458b2ec-dd5e-4599-94f6-b1b66b058f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d458b2ec-dd5e-4599-94f6-b1b66b058f03\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.206018 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.206047 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-lock\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.206136 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42sxm\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-kube-api-access-42sxm\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.206204 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sc7d\" (UniqueName: \"kubernetes.io/projected/ca56f30d-5e09-45f4-be07-c1e536522acc-kube-api-access-4sc7d\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.206221 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.206233 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.206243 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca56f30d-5e09-45f4-be07-c1e536522acc-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.207979 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-cache\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: E1203 22:25:27.208382 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:25:27 crc kubenswrapper[4830]: E1203 22:25:27.208400 4830 projected.go:194] Error preparing data for projected 
volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 22:25:27 crc kubenswrapper[4830]: E1203 22:25:27.208440 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift podName:eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e nodeName:}" failed. No retries permitted until 2025-12-03 22:25:27.708426502 +0000 UTC m=+1216.704887851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift") pod "swift-storage-0" (UID: "eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e") : configmap "swift-ring-files" not found Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.209003 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-lock\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.215298 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.215334 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d458b2ec-dd5e-4599-94f6-b1b66b058f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d458b2ec-dd5e-4599-94f6-b1b66b058f03\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5bd20f1bc71c6684b2e5f72336f30ec2055b08bf5a54656852d5c6c53589cf11/globalmount\"" pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.231523 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42sxm\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-kube-api-access-42sxm\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.260297 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d458b2ec-dd5e-4599-94f6-b1b66b058f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d458b2ec-dd5e-4599-94f6-b1b66b058f03\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.354399 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a6ed877-b9b1-4c85-b587-f98928d02441" path="/var/lib/kubelet/pods/1a6ed877-b9b1-4c85-b587-f98928d02441/volumes" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.570876 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj" event={"ID":"ca56f30d-5e09-45f4-be07-c1e536522acc","Type":"ContainerDied","Data":"8c706f286c54b98c0e9a83b32828f9dc70b1632b3fa181180255c3f0a641367f"} Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.570925 4830 scope.go:117] 
"RemoveContainer" containerID="250b06d665fa887b2a65693338db98a13d2b611250cf63931318d2a2ad8131ce" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.570930 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-tk6tj" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.573931 4830 generic.go:334] "Generic (PLEG): container finished" podID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerID="69867339cd6a8969d75a977c4cf3c9124d356d52614a9c93606e04bceeb178fc" exitCode=0 Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.573990 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xzbv8" event={"ID":"ae971dc1-0fb1-482a-a05a-2aa2adb99a53","Type":"ContainerDied","Data":"69867339cd6a8969d75a977c4cf3c9124d356d52614a9c93606e04bceeb178fc"} Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.574051 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xzbv8" event={"ID":"ae971dc1-0fb1-482a-a05a-2aa2adb99a53","Type":"ContainerStarted","Data":"549814468a3a5d0cb5b1ee9603d5aff88fde36f23ed4a615d05e523516def7f2"} Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.580318 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"21e1ac03-6466-4663-bff2-68ff2cc7801d","Type":"ContainerStarted","Data":"0f4547d59b0084c61af6a2c7deeda6f8ba30e0a7e82436f62f57658a09d63e21"} Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.607680 4830 scope.go:117] "RemoveContainer" containerID="66f4f7ca27f942212327ebed1d62d17fba35017993200cb69edad10deedbd22e" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.630353 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tk6tj"] Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.643921 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-tk6tj"] Dec 03 
22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.644228 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371981.210579 podStartE2EDuration="55.644197599s" podCreationTimestamp="2025-12-03 22:24:32 +0000 UTC" firstStartedPulling="2025-12-03 22:24:34.500416862 +0000 UTC m=+1163.496878211" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:27.625457816 +0000 UTC m=+1216.621919165" watchObservedRunningTime="2025-12-03 22:25:27.644197599 +0000 UTC m=+1216.640658958" Dec 03 22:25:27 crc kubenswrapper[4830]: I1203 22:25:27.714229 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:27 crc kubenswrapper[4830]: E1203 22:25:27.715181 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:25:27 crc kubenswrapper[4830]: E1203 22:25:27.715213 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 22:25:27 crc kubenswrapper[4830]: E1203 22:25:27.715247 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift podName:eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e nodeName:}" failed. No retries permitted until 2025-12-03 22:25:28.715234651 +0000 UTC m=+1217.711696010 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift") pod "swift-storage-0" (UID: "eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e") : configmap "swift-ring-files" not found Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.019640 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-fqk8x"] Dec 03 22:25:28 crc kubenswrapper[4830]: E1203 22:25:28.020214 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca56f30d-5e09-45f4-be07-c1e536522acc" containerName="dnsmasq-dns" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.020228 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca56f30d-5e09-45f4-be07-c1e536522acc" containerName="dnsmasq-dns" Dec 03 22:25:28 crc kubenswrapper[4830]: E1203 22:25:28.020239 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca56f30d-5e09-45f4-be07-c1e536522acc" containerName="init" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.020263 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca56f30d-5e09-45f4-be07-c1e536522acc" containerName="init" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.020463 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca56f30d-5e09-45f4-be07-c1e536522acc" containerName="dnsmasq-dns" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.022080 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.035921 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.036151 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.036304 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.057728 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-fqk8x"] Dec 03 22:25:28 crc kubenswrapper[4830]: E1203 22:25:28.061192 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-j8p6m ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-fqk8x" podUID="40d4cfb8-be84-47ab-b6af-721f583c935e" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.069391 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-fqk8x"] Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.078265 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wvw9q"] Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.079432 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.094440 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wvw9q"] Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.123335 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-dispersionconf\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.124501 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40d4cfb8-be84-47ab-b6af-721f583c935e-etc-swift\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.124652 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-ring-data-devices\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.124768 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-scripts\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.124847 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-j8p6m\" (UniqueName: \"kubernetes.io/projected/40d4cfb8-be84-47ab-b6af-721f583c935e-kube-api-access-j8p6m\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.124891 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-combined-ca-bundle\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.124941 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-swiftconf\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229499 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-scripts\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229576 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8p6m\" (UniqueName: \"kubernetes.io/projected/40d4cfb8-be84-47ab-b6af-721f583c935e-kube-api-access-j8p6m\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229620 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-combined-ca-bundle\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229645 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-swiftconf\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229675 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-scripts\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229702 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-dispersionconf\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229722 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19889054-44cb-47a4-a604-a319f1bd25af-etc-swift\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229749 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qsb6\" (UniqueName: 
\"kubernetes.io/projected/19889054-44cb-47a4-a604-a319f1bd25af-kube-api-access-9qsb6\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229798 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-swiftconf\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229825 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-ring-data-devices\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229880 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-dispersionconf\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229901 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40d4cfb8-be84-47ab-b6af-721f583c935e-etc-swift\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229928 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-combined-ca-bundle\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.229962 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-ring-data-devices\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.230850 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-ring-data-devices\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.231254 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-scripts\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.234323 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40d4cfb8-be84-47ab-b6af-721f583c935e-etc-swift\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.238434 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-combined-ca-bundle\") pod 
\"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.244630 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-swiftconf\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.249265 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-dispersionconf\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.264120 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8p6m\" (UniqueName: \"kubernetes.io/projected/40d4cfb8-be84-47ab-b6af-721f583c935e-kube-api-access-j8p6m\") pod \"swift-ring-rebalance-fqk8x\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.331225 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-swiftconf\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.331281 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-ring-data-devices\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " 
pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.331346 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-combined-ca-bundle\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.331414 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-scripts\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.331432 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19889054-44cb-47a4-a604-a319f1bd25af-etc-swift\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.331445 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-dispersionconf\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.331466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qsb6\" (UniqueName: \"kubernetes.io/projected/19889054-44cb-47a4-a604-a319f1bd25af-kube-api-access-9qsb6\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc 
kubenswrapper[4830]: I1203 22:25:28.332421 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-scripts\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.332454 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19889054-44cb-47a4-a604-a319f1bd25af-etc-swift\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.332876 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-ring-data-devices\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.335014 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-combined-ca-bundle\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.336444 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-swiftconf\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.337695 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-dispersionconf\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.361168 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qsb6\" (UniqueName: \"kubernetes.io/projected/19889054-44cb-47a4-a604-a319f1bd25af-kube-api-access-9qsb6\") pod \"swift-ring-rebalance-wvw9q\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.414605 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.591589 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xzbv8" event={"ID":"ae971dc1-0fb1-482a-a05a-2aa2adb99a53","Type":"ContainerStarted","Data":"9dfdfaad9bfdf2d13d69b094f175853adcce7f24de017b853eb1ce418bd52a17"} Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.591752 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.594409 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.594474 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"cd27b153-5334-4329-91de-0e6941ae9e97","Type":"ContainerStarted","Data":"8a8a5af307c50132607b6bc39acaba476cff5d01a5f2e8439a2f0932644bd869"} Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.594800 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.597796 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.605793 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.616784 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-xzbv8" podStartSLOduration=3.616764822 podStartE2EDuration="3.616764822s" podCreationTimestamp="2025-12-03 22:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:28.6108157 +0000 UTC m=+1217.607277059" watchObservedRunningTime="2025-12-03 22:25:28.616764822 +0000 UTC m=+1217.613226171" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.644700 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=31.608948658 podStartE2EDuration="53.644683656s" podCreationTimestamp="2025-12-03 22:24:35 +0000 UTC" firstStartedPulling="2025-12-03 22:25:03.21523346 +0000 UTC m=+1192.211694809" lastFinishedPulling="2025-12-03 22:25:25.250968458 +0000 UTC m=+1214.247429807" observedRunningTime="2025-12-03 
22:25:28.635850645 +0000 UTC m=+1217.632311984" watchObservedRunningTime="2025-12-03 22:25:28.644683656 +0000 UTC m=+1217.641145005" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.736897 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-scripts\") pod \"40d4cfb8-be84-47ab-b6af-721f583c935e\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.737009 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40d4cfb8-be84-47ab-b6af-721f583c935e-etc-swift\") pod \"40d4cfb8-be84-47ab-b6af-721f583c935e\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.737062 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-dispersionconf\") pod \"40d4cfb8-be84-47ab-b6af-721f583c935e\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.737096 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-ring-data-devices\") pod \"40d4cfb8-be84-47ab-b6af-721f583c935e\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.737155 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-combined-ca-bundle\") pod \"40d4cfb8-be84-47ab-b6af-721f583c935e\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.737180 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-j8p6m\" (UniqueName: \"kubernetes.io/projected/40d4cfb8-be84-47ab-b6af-721f583c935e-kube-api-access-j8p6m\") pod \"40d4cfb8-be84-47ab-b6af-721f583c935e\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.737199 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-swiftconf\") pod \"40d4cfb8-be84-47ab-b6af-721f583c935e\" (UID: \"40d4cfb8-be84-47ab-b6af-721f583c935e\") " Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.737587 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.738279 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-scripts" (OuterVolumeSpecName: "scripts") pod "40d4cfb8-be84-47ab-b6af-721f583c935e" (UID: "40d4cfb8-be84-47ab-b6af-721f583c935e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.738486 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40d4cfb8-be84-47ab-b6af-721f583c935e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "40d4cfb8-be84-47ab-b6af-721f583c935e" (UID: "40d4cfb8-be84-47ab-b6af-721f583c935e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.739394 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "40d4cfb8-be84-47ab-b6af-721f583c935e" (UID: "40d4cfb8-be84-47ab-b6af-721f583c935e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:28 crc kubenswrapper[4830]: E1203 22:25:28.740272 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:25:28 crc kubenswrapper[4830]: E1203 22:25:28.740300 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 22:25:28 crc kubenswrapper[4830]: E1203 22:25:28.740338 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift podName:eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e nodeName:}" failed. No retries permitted until 2025-12-03 22:25:30.740322151 +0000 UTC m=+1219.736783490 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift") pod "swift-storage-0" (UID: "eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e") : configmap "swift-ring-files" not found Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.742750 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "40d4cfb8-be84-47ab-b6af-721f583c935e" (UID: "40d4cfb8-be84-47ab-b6af-721f583c935e"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.744892 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "40d4cfb8-be84-47ab-b6af-721f583c935e" (UID: "40d4cfb8-be84-47ab-b6af-721f583c935e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.744924 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40d4cfb8-be84-47ab-b6af-721f583c935e" (UID: "40d4cfb8-be84-47ab-b6af-721f583c935e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.749828 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40d4cfb8-be84-47ab-b6af-721f583c935e-kube-api-access-j8p6m" (OuterVolumeSpecName: "kube-api-access-j8p6m") pod "40d4cfb8-be84-47ab-b6af-721f583c935e" (UID: "40d4cfb8-be84-47ab-b6af-721f583c935e"). InnerVolumeSpecName "kube-api-access-j8p6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.839650 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.839866 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8p6m\" (UniqueName: \"kubernetes.io/projected/40d4cfb8-be84-47ab-b6af-721f583c935e-kube-api-access-j8p6m\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.839927 4830 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.840013 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.840077 4830 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40d4cfb8-be84-47ab-b6af-721f583c935e-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.840139 4830 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40d4cfb8-be84-47ab-b6af-721f583c935e-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:28 crc kubenswrapper[4830]: I1203 22:25:28.840196 4830 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40d4cfb8-be84-47ab-b6af-721f583c935e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:29 crc kubenswrapper[4830]: I1203 22:25:29.347935 4830 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca56f30d-5e09-45f4-be07-c1e536522acc" path="/var/lib/kubelet/pods/ca56f30d-5e09-45f4-be07-c1e536522acc/volumes" Dec 03 22:25:29 crc kubenswrapper[4830]: I1203 22:25:29.601822 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fqk8x" Dec 03 22:25:29 crc kubenswrapper[4830]: I1203 22:25:29.621083 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l" Dec 03 22:25:29 crc kubenswrapper[4830]: I1203 22:25:29.656740 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-fqk8x"] Dec 03 22:25:29 crc kubenswrapper[4830]: I1203 22:25:29.665311 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-fqk8x"] Dec 03 22:25:30 crc kubenswrapper[4830]: I1203 22:25:30.777036 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:30 crc kubenswrapper[4830]: E1203 22:25:30.777279 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:25:30 crc kubenswrapper[4830]: E1203 22:25:30.777327 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 22:25:30 crc kubenswrapper[4830]: E1203 22:25:30.777398 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift podName:eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e nodeName:}" failed. No retries permitted until 2025-12-03 22:25:34.777362902 +0000 UTC m=+1223.773824251 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift") pod "swift-storage-0" (UID: "eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e") : configmap "swift-ring-files" not found Dec 03 22:25:31 crc kubenswrapper[4830]: I1203 22:25:31.359556 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40d4cfb8-be84-47ab-b6af-721f583c935e" path="/var/lib/kubelet/pods/40d4cfb8-be84-47ab-b6af-721f583c935e/volumes" Dec 03 22:25:33 crc kubenswrapper[4830]: I1203 22:25:33.682432 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wvw9q"] Dec 03 22:25:33 crc kubenswrapper[4830]: W1203 22:25:33.693009 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19889054_44cb_47a4_a604_a319f1bd25af.slice/crio-4f34ace84aa1f4d638b70b9e709c846fa37ed296b66ff9b2d1af06a1d6ee3395 WatchSource:0}: Error finding container 4f34ace84aa1f4d638b70b9e709c846fa37ed296b66ff9b2d1af06a1d6ee3395: Status 404 returned error can't find the container with id 4f34ace84aa1f4d638b70b9e709c846fa37ed296b66ff9b2d1af06a1d6ee3395 Dec 03 22:25:33 crc kubenswrapper[4830]: I1203 22:25:33.864242 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 22:25:33 crc kubenswrapper[4830]: I1203 22:25:33.864304 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 22:25:33 crc kubenswrapper[4830]: I1203 22:25:33.920173 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 22:25:33 crc kubenswrapper[4830]: I1203 22:25:33.920238 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 22:25:34 crc kubenswrapper[4830]: I1203 22:25:34.001445 4830 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 22:25:34 crc kubenswrapper[4830]: I1203 22:25:34.515550 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gvmnk" Dec 03 22:25:34 crc kubenswrapper[4830]: I1203 22:25:34.664199 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e38be206-c963-42f9-834d-a9263b18cbed","Type":"ContainerStarted","Data":"19b361512fa4c6f017b9fc0c350f7985f647adac000fa04cc66280d88048adef"} Dec 03 22:25:34 crc kubenswrapper[4830]: I1203 22:25:34.665646 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wvw9q" event={"ID":"19889054-44cb-47a4-a604-a319f1bd25af","Type":"ContainerStarted","Data":"4f34ace84aa1f4d638b70b9e709c846fa37ed296b66ff9b2d1af06a1d6ee3395"} Dec 03 22:25:34 crc kubenswrapper[4830]: I1203 22:25:34.745562 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 22:25:34 crc kubenswrapper[4830]: I1203 22:25:34.748406 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-548665d79b-xntf7" Dec 03 22:25:34 crc kubenswrapper[4830]: I1203 22:25:34.863703 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:34 crc kubenswrapper[4830]: E1203 22:25:34.864714 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:25:34 crc kubenswrapper[4830]: E1203 22:25:34.864744 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap 
"swift-ring-files" not found Dec 03 22:25:34 crc kubenswrapper[4830]: E1203 22:25:34.864796 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift podName:eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e nodeName:}" failed. No retries permitted until 2025-12-03 22:25:42.864774198 +0000 UTC m=+1231.861235567 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift") pod "swift-storage-0" (UID: "eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e") : configmap "swift-ring-files" not found Dec 03 22:25:34 crc kubenswrapper[4830]: I1203 22:25:34.926923 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-qg9t9" Dec 03 22:25:36 crc kubenswrapper[4830]: I1203 22:25:36.113782 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 03 22:25:36 crc kubenswrapper[4830]: I1203 22:25:36.234711 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:25:36 crc kubenswrapper[4830]: I1203 22:25:36.339760 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hk62l"] Dec 03 22:25:36 crc kubenswrapper[4830]: I1203 22:25:36.340179 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l" podUID="8958b2b0-be2d-4030-90cf-5e4335a17a50" containerName="dnsmasq-dns" containerID="cri-o://a4022806dfc4acf5aec0d01837fa9e5e5b92ff824173d5c55f40775e02437b5f" gracePeriod=10 Dec 03 22:25:36 crc kubenswrapper[4830]: I1203 22:25:36.569105 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 22:25:36 crc kubenswrapper[4830]: I1203 22:25:36.663252 4830 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 22:25:36 crc kubenswrapper[4830]: I1203 22:25:36.755720 4830 generic.go:334] "Generic (PLEG): container finished" podID="8958b2b0-be2d-4030-90cf-5e4335a17a50" containerID="a4022806dfc4acf5aec0d01837fa9e5e5b92ff824173d5c55f40775e02437b5f" exitCode=0 Dec 03 22:25:36 crc kubenswrapper[4830]: I1203 22:25:36.755827 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l" event={"ID":"8958b2b0-be2d-4030-90cf-5e4335a17a50","Type":"ContainerDied","Data":"a4022806dfc4acf5aec0d01837fa9e5e5b92ff824173d5c55f40775e02437b5f"} Dec 03 22:25:36 crc kubenswrapper[4830]: I1203 22:25:36.766022 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e38be206-c963-42f9-834d-a9263b18cbed","Type":"ContainerStarted","Data":"27139c88accd5c1abc4ea51a562d002e0f7bae4858086f81eba8a195adb9c643"} Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.250040 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.360887 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-nb\") pod \"8958b2b0-be2d-4030-90cf-5e4335a17a50\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.360951 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-dns-svc\") pod \"8958b2b0-be2d-4030-90cf-5e4335a17a50\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.360970 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-sb\") pod \"8958b2b0-be2d-4030-90cf-5e4335a17a50\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.361014 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj7v8\" (UniqueName: \"kubernetes.io/projected/8958b2b0-be2d-4030-90cf-5e4335a17a50-kube-api-access-zj7v8\") pod \"8958b2b0-be2d-4030-90cf-5e4335a17a50\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.361152 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-config\") pod \"8958b2b0-be2d-4030-90cf-5e4335a17a50\" (UID: \"8958b2b0-be2d-4030-90cf-5e4335a17a50\") " Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.366164 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8958b2b0-be2d-4030-90cf-5e4335a17a50-kube-api-access-zj7v8" (OuterVolumeSpecName: "kube-api-access-zj7v8") pod "8958b2b0-be2d-4030-90cf-5e4335a17a50" (UID: "8958b2b0-be2d-4030-90cf-5e4335a17a50"). InnerVolumeSpecName "kube-api-access-zj7v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.409137 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8958b2b0-be2d-4030-90cf-5e4335a17a50" (UID: "8958b2b0-be2d-4030-90cf-5e4335a17a50"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.409241 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8958b2b0-be2d-4030-90cf-5e4335a17a50" (UID: "8958b2b0-be2d-4030-90cf-5e4335a17a50"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.409272 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8958b2b0-be2d-4030-90cf-5e4335a17a50" (UID: "8958b2b0-be2d-4030-90cf-5e4335a17a50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.422036 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-config" (OuterVolumeSpecName: "config") pod "8958b2b0-be2d-4030-90cf-5e4335a17a50" (UID: "8958b2b0-be2d-4030-90cf-5e4335a17a50"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.464178 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.464215 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.464239 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.464251 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8958b2b0-be2d-4030-90cf-5e4335a17a50-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.464263 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj7v8\" (UniqueName: \"kubernetes.io/projected/8958b2b0-be2d-4030-90cf-5e4335a17a50-kube-api-access-zj7v8\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.788992 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l" event={"ID":"8958b2b0-be2d-4030-90cf-5e4335a17a50","Type":"ContainerDied","Data":"4cf7234a3abbec2013eb904ad3a315928617c0fbd64584be533bcf550e997bda"} Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.789053 4830 scope.go:117] "RemoveContainer" containerID="a4022806dfc4acf5aec0d01837fa9e5e5b92ff824173d5c55f40775e02437b5f" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.789173 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hk62l" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.796992 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wvw9q" event={"ID":"19889054-44cb-47a4-a604-a319f1bd25af","Type":"ContainerStarted","Data":"3b42adcd02a75a22a113cef0a4317f8de4e97e8d49d22b2d24a52d4b03a9c235"} Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.829783 4830 scope.go:117] "RemoveContainer" containerID="6da1c4b0907b956e1cfd4d00247a7e97acb548d1bde971552bb69313b34c0777" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.834214 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-wvw9q" podStartSLOduration=6.64688992 podStartE2EDuration="10.834192158s" podCreationTimestamp="2025-12-03 22:25:28 +0000 UTC" firstStartedPulling="2025-12-03 22:25:33.697501881 +0000 UTC m=+1222.693963270" lastFinishedPulling="2025-12-03 22:25:37.884804149 +0000 UTC m=+1226.881265508" observedRunningTime="2025-12-03 22:25:38.829890961 +0000 UTC m=+1227.826352330" watchObservedRunningTime="2025-12-03 22:25:38.834192158 +0000 UTC m=+1227.830653537" Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.857681 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hk62l"] Dec 03 22:25:38 crc kubenswrapper[4830]: I1203 22:25:38.865892 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hk62l"] Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.170250 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8e4a-account-create-update-65bgn"] Dec 03 22:25:39 crc kubenswrapper[4830]: E1203 22:25:39.170817 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8958b2b0-be2d-4030-90cf-5e4335a17a50" containerName="dnsmasq-dns" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.170830 4830 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8958b2b0-be2d-4030-90cf-5e4335a17a50" containerName="dnsmasq-dns" Dec 03 22:25:39 crc kubenswrapper[4830]: E1203 22:25:39.170848 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8958b2b0-be2d-4030-90cf-5e4335a17a50" containerName="init" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.170854 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8958b2b0-be2d-4030-90cf-5e4335a17a50" containerName="init" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.171047 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8958b2b0-be2d-4030-90cf-5e4335a17a50" containerName="dnsmasq-dns" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.171672 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8e4a-account-create-update-65bgn" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.173011 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.179740 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-skf7v"] Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.181278 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-skf7v" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.193214 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8e4a-account-create-update-65bgn"] Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.215736 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-skf7v"] Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.293291 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74447e05-2f6d-441a-9a5a-b275cb318a91-operator-scripts\") pod \"glance-db-create-skf7v\" (UID: \"74447e05-2f6d-441a-9a5a-b275cb318a91\") " pod="openstack/glance-db-create-skf7v" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.293476 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwttf\" (UniqueName: \"kubernetes.io/projected/74447e05-2f6d-441a-9a5a-b275cb318a91-kube-api-access-wwttf\") pod \"glance-db-create-skf7v\" (UID: \"74447e05-2f6d-441a-9a5a-b275cb318a91\") " pod="openstack/glance-db-create-skf7v" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.293666 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97dgh\" (UniqueName: \"kubernetes.io/projected/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-kube-api-access-97dgh\") pod \"glance-8e4a-account-create-update-65bgn\" (UID: \"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c\") " pod="openstack/glance-8e4a-account-create-update-65bgn" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.293804 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-operator-scripts\") pod \"glance-8e4a-account-create-update-65bgn\" (UID: 
\"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c\") " pod="openstack/glance-8e4a-account-create-update-65bgn" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.366092 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8958b2b0-be2d-4030-90cf-5e4335a17a50" path="/var/lib/kubelet/pods/8958b2b0-be2d-4030-90cf-5e4335a17a50/volumes" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.411876 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwttf\" (UniqueName: \"kubernetes.io/projected/74447e05-2f6d-441a-9a5a-b275cb318a91-kube-api-access-wwttf\") pod \"glance-db-create-skf7v\" (UID: \"74447e05-2f6d-441a-9a5a-b275cb318a91\") " pod="openstack/glance-db-create-skf7v" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.411911 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74447e05-2f6d-441a-9a5a-b275cb318a91-operator-scripts\") pod \"glance-db-create-skf7v\" (UID: \"74447e05-2f6d-441a-9a5a-b275cb318a91\") " pod="openstack/glance-db-create-skf7v" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.411950 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97dgh\" (UniqueName: \"kubernetes.io/projected/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-kube-api-access-97dgh\") pod \"glance-8e4a-account-create-update-65bgn\" (UID: \"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c\") " pod="openstack/glance-8e4a-account-create-update-65bgn" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.412027 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-operator-scripts\") pod \"glance-8e4a-account-create-update-65bgn\" (UID: \"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c\") " pod="openstack/glance-8e4a-account-create-update-65bgn" Dec 03 22:25:39 crc 
kubenswrapper[4830]: I1203 22:25:39.412718 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-operator-scripts\") pod \"glance-8e4a-account-create-update-65bgn\" (UID: \"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c\") " pod="openstack/glance-8e4a-account-create-update-65bgn" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.413230 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74447e05-2f6d-441a-9a5a-b275cb318a91-operator-scripts\") pod \"glance-db-create-skf7v\" (UID: \"74447e05-2f6d-441a-9a5a-b275cb318a91\") " pod="openstack/glance-db-create-skf7v" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.436011 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97dgh\" (UniqueName: \"kubernetes.io/projected/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-kube-api-access-97dgh\") pod \"glance-8e4a-account-create-update-65bgn\" (UID: \"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c\") " pod="openstack/glance-8e4a-account-create-update-65bgn" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.436793 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwttf\" (UniqueName: \"kubernetes.io/projected/74447e05-2f6d-441a-9a5a-b275cb318a91-kube-api-access-wwttf\") pod \"glance-db-create-skf7v\" (UID: \"74447e05-2f6d-441a-9a5a-b275cb318a91\") " pod="openstack/glance-db-create-skf7v" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.497960 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8e4a-account-create-update-65bgn" Dec 03 22:25:39 crc kubenswrapper[4830]: I1203 22:25:39.509475 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-skf7v" Dec 03 22:25:41 crc kubenswrapper[4830]: I1203 22:25:41.592053 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-skf7v"] Dec 03 22:25:41 crc kubenswrapper[4830]: I1203 22:25:41.670776 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8e4a-account-create-update-65bgn"] Dec 03 22:25:41 crc kubenswrapper[4830]: I1203 22:25:41.822605 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-skf7v" event={"ID":"74447e05-2f6d-441a-9a5a-b275cb318a91","Type":"ContainerStarted","Data":"9a459ab196641fd8ee1cab6be469ade4a83ac9f84fc13930d676caad58206ba1"} Dec 03 22:25:41 crc kubenswrapper[4830]: I1203 22:25:41.822655 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-skf7v" event={"ID":"74447e05-2f6d-441a-9a5a-b275cb318a91","Type":"ContainerStarted","Data":"a1171622a1a22bafc58333b0e2a36ce3c8479a81906954db13b6cfe2df5429c7"} Dec 03 22:25:41 crc kubenswrapper[4830]: I1203 22:25:41.824111 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8e4a-account-create-update-65bgn" event={"ID":"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c","Type":"ContainerStarted","Data":"d724adf443aba4bd7b9d7cd91fbae10bb4065da91fb4fbb192bbb134c00a458e"} Dec 03 22:25:41 crc kubenswrapper[4830]: I1203 22:25:41.827029 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e38be206-c963-42f9-834d-a9263b18cbed","Type":"ContainerStarted","Data":"cdcda774bd14de42e9e955e8b645bc4c809aadc6cbf62bc94d0018527ec5a70e"} Dec 03 22:25:41 crc kubenswrapper[4830]: I1203 22:25:41.846612 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-skf7v" podStartSLOduration=2.84658435 podStartE2EDuration="2.84658435s" podCreationTimestamp="2025-12-03 22:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:41.842381255 +0000 UTC m=+1230.838842604" watchObservedRunningTime="2025-12-03 22:25:41.84658435 +0000 UTC m=+1230.843045739" Dec 03 22:25:41 crc kubenswrapper[4830]: I1203 22:25:41.884957 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=28.842311537 podStartE2EDuration="1m6.884941929s" podCreationTimestamp="2025-12-03 22:24:35 +0000 UTC" firstStartedPulling="2025-12-03 22:25:03.211371494 +0000 UTC m=+1192.207832843" lastFinishedPulling="2025-12-03 22:25:41.254001886 +0000 UTC m=+1230.250463235" observedRunningTime="2025-12-03 22:25:41.88206485 +0000 UTC m=+1230.878526209" watchObservedRunningTime="2025-12-03 22:25:41.884941929 +0000 UTC m=+1230.881403278" Dec 03 22:25:42 crc kubenswrapper[4830]: I1203 22:25:42.001988 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 22:25:42 crc kubenswrapper[4830]: I1203 22:25:42.837418 4830 generic.go:334] "Generic (PLEG): container finished" podID="74447e05-2f6d-441a-9a5a-b275cb318a91" containerID="9a459ab196641fd8ee1cab6be469ade4a83ac9f84fc13930d676caad58206ba1" exitCode=0 Dec 03 22:25:42 crc kubenswrapper[4830]: I1203 22:25:42.837538 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-skf7v" event={"ID":"74447e05-2f6d-441a-9a5a-b275cb318a91","Type":"ContainerDied","Data":"9a459ab196641fd8ee1cab6be469ade4a83ac9f84fc13930d676caad58206ba1"} Dec 03 22:25:42 crc kubenswrapper[4830]: I1203 22:25:42.839887 4830 generic.go:334] "Generic (PLEG): container finished" podID="caf109e1-3c06-46c6-a0d2-5cd73bdbc98c" containerID="01d4117f8c2746ef4c1bab19aadcc265bc1db4c5c7b498fd49b5899811834aac" exitCode=0 Dec 03 22:25:42 crc kubenswrapper[4830]: I1203 22:25:42.841358 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-8e4a-account-create-update-65bgn" event={"ID":"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c","Type":"ContainerDied","Data":"01d4117f8c2746ef4c1bab19aadcc265bc1db4c5c7b498fd49b5899811834aac"} Dec 03 22:25:42 crc kubenswrapper[4830]: I1203 22:25:42.878935 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:42 crc kubenswrapper[4830]: E1203 22:25:42.879538 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:25:42 crc kubenswrapper[4830]: E1203 22:25:42.879570 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 22:25:42 crc kubenswrapper[4830]: E1203 22:25:42.879655 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift podName:eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e nodeName:}" failed. No retries permitted until 2025-12-03 22:25:58.879628918 +0000 UTC m=+1247.876090297 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift") pod "swift-storage-0" (UID: "eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e") : configmap "swift-ring-files" not found Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.472033 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-gx7cl"] Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.474324 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gx7cl" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.491136 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gx7cl"] Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.556037 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8d55-account-create-update-nqj5k"] Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.557551 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8d55-account-create-update-nqj5k" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.560384 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.568960 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8d55-account-create-update-nqj5k"] Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.594559 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1431d7e-ecf6-4b69-891b-6522466fafb9-operator-scripts\") pod \"keystone-db-create-gx7cl\" (UID: \"b1431d7e-ecf6-4b69-891b-6522466fafb9\") " pod="openstack/keystone-db-create-gx7cl" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.594609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whvxj\" (UniqueName: \"kubernetes.io/projected/b1431d7e-ecf6-4b69-891b-6522466fafb9-kube-api-access-whvxj\") pod \"keystone-db-create-gx7cl\" (UID: \"b1431d7e-ecf6-4b69-891b-6522466fafb9\") " pod="openstack/keystone-db-create-gx7cl" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.696033 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b1431d7e-ecf6-4b69-891b-6522466fafb9-operator-scripts\") pod \"keystone-db-create-gx7cl\" (UID: \"b1431d7e-ecf6-4b69-891b-6522466fafb9\") " pod="openstack/keystone-db-create-gx7cl" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.696086 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whvxj\" (UniqueName: \"kubernetes.io/projected/b1431d7e-ecf6-4b69-891b-6522466fafb9-kube-api-access-whvxj\") pod \"keystone-db-create-gx7cl\" (UID: \"b1431d7e-ecf6-4b69-891b-6522466fafb9\") " pod="openstack/keystone-db-create-gx7cl" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.696132 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfsc2\" (UniqueName: \"kubernetes.io/projected/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-kube-api-access-xfsc2\") pod \"keystone-8d55-account-create-update-nqj5k\" (UID: \"a0df67c3-b98f-4294-9aa8-73ac7efb6b99\") " pod="openstack/keystone-8d55-account-create-update-nqj5k" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.696166 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-operator-scripts\") pod \"keystone-8d55-account-create-update-nqj5k\" (UID: \"a0df67c3-b98f-4294-9aa8-73ac7efb6b99\") " pod="openstack/keystone-8d55-account-create-update-nqj5k" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.696941 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1431d7e-ecf6-4b69-891b-6522466fafb9-operator-scripts\") pod \"keystone-db-create-gx7cl\" (UID: \"b1431d7e-ecf6-4b69-891b-6522466fafb9\") " pod="openstack/keystone-db-create-gx7cl" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.739023 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-whvxj\" (UniqueName: \"kubernetes.io/projected/b1431d7e-ecf6-4b69-891b-6522466fafb9-kube-api-access-whvxj\") pod \"keystone-db-create-gx7cl\" (UID: \"b1431d7e-ecf6-4b69-891b-6522466fafb9\") " pod="openstack/keystone-db-create-gx7cl" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.770568 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-9hhqz"] Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.771829 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9hhqz" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.781153 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9hhqz"] Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.798356 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfsc2\" (UniqueName: \"kubernetes.io/projected/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-kube-api-access-xfsc2\") pod \"keystone-8d55-account-create-update-nqj5k\" (UID: \"a0df67c3-b98f-4294-9aa8-73ac7efb6b99\") " pod="openstack/keystone-8d55-account-create-update-nqj5k" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.799046 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-operator-scripts\") pod \"keystone-8d55-account-create-update-nqj5k\" (UID: \"a0df67c3-b98f-4294-9aa8-73ac7efb6b99\") " pod="openstack/keystone-8d55-account-create-update-nqj5k" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.799667 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-operator-scripts\") pod \"keystone-8d55-account-create-update-nqj5k\" (UID: \"a0df67c3-b98f-4294-9aa8-73ac7efb6b99\") " 
pod="openstack/keystone-8d55-account-create-update-nqj5k" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.815277 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gx7cl" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.830339 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfsc2\" (UniqueName: \"kubernetes.io/projected/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-kube-api-access-xfsc2\") pod \"keystone-8d55-account-create-update-nqj5k\" (UID: \"a0df67c3-b98f-4294-9aa8-73ac7efb6b99\") " pod="openstack/keystone-8d55-account-create-update-nqj5k" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.879461 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8d55-account-create-update-nqj5k" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.882795 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-89cc-account-create-update-vgvp8"] Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.883958 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-89cc-account-create-update-vgvp8" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.888669 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.892696 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-89cc-account-create-update-vgvp8"] Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.900925 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg5mz\" (UniqueName: \"kubernetes.io/projected/561f4d1a-2ac0-4446-a60d-922905025583-kube-api-access-tg5mz\") pod \"placement-db-create-9hhqz\" (UID: \"561f4d1a-2ac0-4446-a60d-922905025583\") " pod="openstack/placement-db-create-9hhqz" Dec 03 22:25:43 crc kubenswrapper[4830]: I1203 22:25:43.902032 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561f4d1a-2ac0-4446-a60d-922905025583-operator-scripts\") pod \"placement-db-create-9hhqz\" (UID: \"561f4d1a-2ac0-4446-a60d-922905025583\") " pod="openstack/placement-db-create-9hhqz" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.003456 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561f4d1a-2ac0-4446-a60d-922905025583-operator-scripts\") pod \"placement-db-create-9hhqz\" (UID: \"561f4d1a-2ac0-4446-a60d-922905025583\") " pod="openstack/placement-db-create-9hhqz" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.004402 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5jjv\" (UniqueName: \"kubernetes.io/projected/766590ba-c5f6-426a-8562-bd3440bdbaa0-kube-api-access-k5jjv\") pod \"placement-89cc-account-create-update-vgvp8\" (UID: 
\"766590ba-c5f6-426a-8562-bd3440bdbaa0\") " pod="openstack/placement-89cc-account-create-update-vgvp8" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.004430 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/766590ba-c5f6-426a-8562-bd3440bdbaa0-operator-scripts\") pod \"placement-89cc-account-create-update-vgvp8\" (UID: \"766590ba-c5f6-426a-8562-bd3440bdbaa0\") " pod="openstack/placement-89cc-account-create-update-vgvp8" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.004463 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg5mz\" (UniqueName: \"kubernetes.io/projected/561f4d1a-2ac0-4446-a60d-922905025583-kube-api-access-tg5mz\") pod \"placement-db-create-9hhqz\" (UID: \"561f4d1a-2ac0-4446-a60d-922905025583\") " pod="openstack/placement-db-create-9hhqz" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.004404 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561f4d1a-2ac0-4446-a60d-922905025583-operator-scripts\") pod \"placement-db-create-9hhqz\" (UID: \"561f4d1a-2ac0-4446-a60d-922905025583\") " pod="openstack/placement-db-create-9hhqz" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.026686 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg5mz\" (UniqueName: \"kubernetes.io/projected/561f4d1a-2ac0-4446-a60d-922905025583-kube-api-access-tg5mz\") pod \"placement-db-create-9hhqz\" (UID: \"561f4d1a-2ac0-4446-a60d-922905025583\") " pod="openstack/placement-db-create-9hhqz" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.095317 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-9hhqz" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.105958 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5jjv\" (UniqueName: \"kubernetes.io/projected/766590ba-c5f6-426a-8562-bd3440bdbaa0-kube-api-access-k5jjv\") pod \"placement-89cc-account-create-update-vgvp8\" (UID: \"766590ba-c5f6-426a-8562-bd3440bdbaa0\") " pod="openstack/placement-89cc-account-create-update-vgvp8" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.105992 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/766590ba-c5f6-426a-8562-bd3440bdbaa0-operator-scripts\") pod \"placement-89cc-account-create-update-vgvp8\" (UID: \"766590ba-c5f6-426a-8562-bd3440bdbaa0\") " pod="openstack/placement-89cc-account-create-update-vgvp8" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.106685 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/766590ba-c5f6-426a-8562-bd3440bdbaa0-operator-scripts\") pod \"placement-89cc-account-create-update-vgvp8\" (UID: \"766590ba-c5f6-426a-8562-bd3440bdbaa0\") " pod="openstack/placement-89cc-account-create-update-vgvp8" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.120647 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5jjv\" (UniqueName: \"kubernetes.io/projected/766590ba-c5f6-426a-8562-bd3440bdbaa0-kube-api-access-k5jjv\") pod \"placement-89cc-account-create-update-vgvp8\" (UID: \"766590ba-c5f6-426a-8562-bd3440bdbaa0\") " pod="openstack/placement-89cc-account-create-update-vgvp8" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.259609 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-89cc-account-create-update-vgvp8" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.435623 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8e4a-account-create-update-65bgn" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.438048 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gx7cl"] Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.465923 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-skf7v" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.511811 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74447e05-2f6d-441a-9a5a-b275cb318a91-operator-scripts\") pod \"74447e05-2f6d-441a-9a5a-b275cb318a91\" (UID: \"74447e05-2f6d-441a-9a5a-b275cb318a91\") " Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.512014 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwttf\" (UniqueName: \"kubernetes.io/projected/74447e05-2f6d-441a-9a5a-b275cb318a91-kube-api-access-wwttf\") pod \"74447e05-2f6d-441a-9a5a-b275cb318a91\" (UID: \"74447e05-2f6d-441a-9a5a-b275cb318a91\") " Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.512069 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-operator-scripts\") pod \"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c\" (UID: \"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c\") " Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.512172 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97dgh\" (UniqueName: \"kubernetes.io/projected/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-kube-api-access-97dgh\") pod 
\"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c\" (UID: \"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c\") " Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.514457 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "caf109e1-3c06-46c6-a0d2-5cd73bdbc98c" (UID: "caf109e1-3c06-46c6-a0d2-5cd73bdbc98c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.517656 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74447e05-2f6d-441a-9a5a-b275cb318a91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74447e05-2f6d-441a-9a5a-b275cb318a91" (UID: "74447e05-2f6d-441a-9a5a-b275cb318a91"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.518587 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-kube-api-access-97dgh" (OuterVolumeSpecName: "kube-api-access-97dgh") pod "caf109e1-3c06-46c6-a0d2-5cd73bdbc98c" (UID: "caf109e1-3c06-46c6-a0d2-5cd73bdbc98c"). InnerVolumeSpecName "kube-api-access-97dgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.521690 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74447e05-2f6d-441a-9a5a-b275cb318a91-kube-api-access-wwttf" (OuterVolumeSpecName: "kube-api-access-wwttf") pod "74447e05-2f6d-441a-9a5a-b275cb318a91" (UID: "74447e05-2f6d-441a-9a5a-b275cb318a91"). InnerVolumeSpecName "kube-api-access-wwttf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.614767 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwttf\" (UniqueName: \"kubernetes.io/projected/74447e05-2f6d-441a-9a5a-b275cb318a91-kube-api-access-wwttf\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.614805 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.614822 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97dgh\" (UniqueName: \"kubernetes.io/projected/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c-kube-api-access-97dgh\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.614832 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74447e05-2f6d-441a-9a5a-b275cb318a91-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.631773 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8d55-account-create-update-nqj5k"] Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.703467 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9hhqz"] Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.801472 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nlxm7" podUID="6bdf507c-05be-4df8-8c33-85f15c05237c" containerName="ovn-controller" probeResult="failure" output=< Dec 03 22:25:44 crc kubenswrapper[4830]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 22:25:44 crc kubenswrapper[4830]: > Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 
22:25:44.897519 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gx7cl" event={"ID":"b1431d7e-ecf6-4b69-891b-6522466fafb9","Type":"ContainerStarted","Data":"309fe05d8c874a61e3766d271069fa871f6efd3d65a01eca0df14ca6956893a2"} Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.898690 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9hhqz" event={"ID":"561f4d1a-2ac0-4446-a60d-922905025583","Type":"ContainerStarted","Data":"5cb3c5e9fca7f079086c8d5998d50d731d3b80ab78898c53943ce7a355cdfbd0"} Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.900117 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8d55-account-create-update-nqj5k" event={"ID":"a0df67c3-b98f-4294-9aa8-73ac7efb6b99","Type":"ContainerStarted","Data":"de078c5adf80211cfc68ccdcd590961fe1239816d1ac430e1cfd9b7d0d43d80b"} Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.900217 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.902781 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-skf7v" event={"ID":"74447e05-2f6d-441a-9a5a-b275cb318a91","Type":"ContainerDied","Data":"a1171622a1a22bafc58333b0e2a36ce3c8479a81906954db13b6cfe2df5429c7"} Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.902807 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1171622a1a22bafc58333b0e2a36ce3c8479a81906954db13b6cfe2df5429c7" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.902828 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-skf7v" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.904657 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8e4a-account-create-update-65bgn" event={"ID":"caf109e1-3c06-46c6-a0d2-5cd73bdbc98c","Type":"ContainerDied","Data":"d724adf443aba4bd7b9d7cd91fbae10bb4065da91fb4fbb192bbb134c00a458e"} Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.904690 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d724adf443aba4bd7b9d7cd91fbae10bb4065da91fb4fbb192bbb134c00a458e" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.904750 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8e4a-account-create-update-65bgn" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.912283 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8j27r" Dec 03 22:25:44 crc kubenswrapper[4830]: I1203 22:25:44.988691 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-89cc-account-create-update-vgvp8"] Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.144407 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nlxm7-config-v42v4"] Dec 03 22:25:45 crc kubenswrapper[4830]: E1203 22:25:45.144800 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf109e1-3c06-46c6-a0d2-5cd73bdbc98c" containerName="mariadb-account-create-update" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.144813 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf109e1-3c06-46c6-a0d2-5cd73bdbc98c" containerName="mariadb-account-create-update" Dec 03 22:25:45 crc kubenswrapper[4830]: E1203 22:25:45.144826 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74447e05-2f6d-441a-9a5a-b275cb318a91" containerName="mariadb-database-create" Dec 03 22:25:45 crc 
kubenswrapper[4830]: I1203 22:25:45.144832 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="74447e05-2f6d-441a-9a5a-b275cb318a91" containerName="mariadb-database-create" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.144989 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="74447e05-2f6d-441a-9a5a-b275cb318a91" containerName="mariadb-database-create" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.145002 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="caf109e1-3c06-46c6-a0d2-5cd73bdbc98c" containerName="mariadb-account-create-update" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.145653 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.152112 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.159054 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nlxm7-config-v42v4"] Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.227395 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45q2j\" (UniqueName: \"kubernetes.io/projected/ead028dc-bc30-4f4f-929e-e10348f62f36-kube-api-access-45q2j\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.227784 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-log-ovn\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 
22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.227821 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-additional-scripts\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.227860 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.228100 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-scripts\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.228199 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run-ovn\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.330111 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-scripts\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " 
pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.330155 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run-ovn\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.330198 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45q2j\" (UniqueName: \"kubernetes.io/projected/ead028dc-bc30-4f4f-929e-e10348f62f36-kube-api-access-45q2j\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.330277 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-log-ovn\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.330309 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-additional-scripts\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.330344 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " 
pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.330540 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-log-ovn\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.330563 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run-ovn\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.330555 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.332035 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-additional-scripts\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.332354 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-scripts\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc 
kubenswrapper[4830]: I1203 22:25:45.356566 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45q2j\" (UniqueName: \"kubernetes.io/projected/ead028dc-bc30-4f4f-929e-e10348f62f36-kube-api-access-45q2j\") pod \"ovn-controller-nlxm7-config-v42v4\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.467363 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.751072 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="09564097-60ae-4b1d-bd03-ba8b5a254167" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.916699 4830 generic.go:334] "Generic (PLEG): container finished" podID="19889054-44cb-47a4-a604-a319f1bd25af" containerID="3b42adcd02a75a22a113cef0a4317f8de4e97e8d49d22b2d24a52d4b03a9c235" exitCode=0 Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.916817 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wvw9q" event={"ID":"19889054-44cb-47a4-a604-a319f1bd25af","Type":"ContainerDied","Data":"3b42adcd02a75a22a113cef0a4317f8de4e97e8d49d22b2d24a52d4b03a9c235"} Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.919464 4830 generic.go:334] "Generic (PLEG): container finished" podID="561f4d1a-2ac0-4446-a60d-922905025583" containerID="931d028b345093106aeddc013ab15181e99804a7852b95d6fe5f29ad3f446c92" exitCode=0 Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.919588 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9hhqz" 
event={"ID":"561f4d1a-2ac0-4446-a60d-922905025583","Type":"ContainerDied","Data":"931d028b345093106aeddc013ab15181e99804a7852b95d6fe5f29ad3f446c92"} Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.921856 4830 generic.go:334] "Generic (PLEG): container finished" podID="a0df67c3-b98f-4294-9aa8-73ac7efb6b99" containerID="4e43c513a887fc31d9893f9f16f61bbd0965d9f769e06d7765a8c818238b3325" exitCode=0 Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.921943 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8d55-account-create-update-nqj5k" event={"ID":"a0df67c3-b98f-4294-9aa8-73ac7efb6b99","Type":"ContainerDied","Data":"4e43c513a887fc31d9893f9f16f61bbd0965d9f769e06d7765a8c818238b3325"} Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.924435 4830 generic.go:334] "Generic (PLEG): container finished" podID="b1431d7e-ecf6-4b69-891b-6522466fafb9" containerID="20cac797d0ba41b661a2bb06b3a8f213c3f1e241fd77bede695545c27f6a1243" exitCode=0 Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.924490 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gx7cl" event={"ID":"b1431d7e-ecf6-4b69-891b-6522466fafb9","Type":"ContainerDied","Data":"20cac797d0ba41b661a2bb06b3a8f213c3f1e241fd77bede695545c27f6a1243"} Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.925898 4830 generic.go:334] "Generic (PLEG): container finished" podID="766590ba-c5f6-426a-8562-bd3440bdbaa0" containerID="7e67ba5c6a8156a872c1de79b612ceb9edec00a2e0c19a4a325b34a372d65dca" exitCode=0 Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.926173 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-89cc-account-create-update-vgvp8" event={"ID":"766590ba-c5f6-426a-8562-bd3440bdbaa0","Type":"ContainerDied","Data":"7e67ba5c6a8156a872c1de79b612ceb9edec00a2e0c19a4a325b34a372d65dca"} Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.926214 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-89cc-account-create-update-vgvp8" event={"ID":"766590ba-c5f6-426a-8562-bd3440bdbaa0","Type":"ContainerStarted","Data":"87553e6049d4ba53ba6fcd3d3f803ec0d4cb8c86129f405c8c5d63863cae8837"} Dec 03 22:25:45 crc kubenswrapper[4830]: I1203 22:25:45.939457 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nlxm7-config-v42v4"] Dec 03 22:25:46 crc kubenswrapper[4830]: I1203 22:25:46.937231 4830 generic.go:334] "Generic (PLEG): container finished" podID="ead028dc-bc30-4f4f-929e-e10348f62f36" containerID="53b55af998e0f4cdf29eda24fd09747ad3a661b5682c2892db5e024af9bbd87f" exitCode=0 Dec 03 22:25:46 crc kubenswrapper[4830]: I1203 22:25:46.937283 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nlxm7-config-v42v4" event={"ID":"ead028dc-bc30-4f4f-929e-e10348f62f36","Type":"ContainerDied","Data":"53b55af998e0f4cdf29eda24fd09747ad3a661b5682c2892db5e024af9bbd87f"} Dec 03 22:25:46 crc kubenswrapper[4830]: I1203 22:25:46.937849 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nlxm7-config-v42v4" event={"ID":"ead028dc-bc30-4f4f-929e-e10348f62f36","Type":"ContainerStarted","Data":"9b25172f4b7e729c2d8117ff3f6371a170b5dfe130edc2b9895580d0d6eadfef"} Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.480347 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-89cc-account-create-update-vgvp8" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.572078 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5jjv\" (UniqueName: \"kubernetes.io/projected/766590ba-c5f6-426a-8562-bd3440bdbaa0-kube-api-access-k5jjv\") pod \"766590ba-c5f6-426a-8562-bd3440bdbaa0\" (UID: \"766590ba-c5f6-426a-8562-bd3440bdbaa0\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.572467 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/766590ba-c5f6-426a-8562-bd3440bdbaa0-operator-scripts\") pod \"766590ba-c5f6-426a-8562-bd3440bdbaa0\" (UID: \"766590ba-c5f6-426a-8562-bd3440bdbaa0\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.573155 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/766590ba-c5f6-426a-8562-bd3440bdbaa0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "766590ba-c5f6-426a-8562-bd3440bdbaa0" (UID: "766590ba-c5f6-426a-8562-bd3440bdbaa0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.576953 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/766590ba-c5f6-426a-8562-bd3440bdbaa0-kube-api-access-k5jjv" (OuterVolumeSpecName: "kube-api-access-k5jjv") pod "766590ba-c5f6-426a-8562-bd3440bdbaa0" (UID: "766590ba-c5f6-426a-8562-bd3440bdbaa0"). InnerVolumeSpecName "kube-api-access-k5jjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.614875 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gx7cl" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.620971 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9hhqz" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.629821 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8d55-account-create-update-nqj5k" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.631876 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.674729 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfsc2\" (UniqueName: \"kubernetes.io/projected/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-kube-api-access-xfsc2\") pod \"a0df67c3-b98f-4294-9aa8-73ac7efb6b99\" (UID: \"a0df67c3-b98f-4294-9aa8-73ac7efb6b99\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.674798 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561f4d1a-2ac0-4446-a60d-922905025583-operator-scripts\") pod \"561f4d1a-2ac0-4446-a60d-922905025583\" (UID: \"561f4d1a-2ac0-4446-a60d-922905025583\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.674877 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-ring-data-devices\") pod \"19889054-44cb-47a4-a604-a319f1bd25af\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.674934 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-operator-scripts\") pod 
\"a0df67c3-b98f-4294-9aa8-73ac7efb6b99\" (UID: \"a0df67c3-b98f-4294-9aa8-73ac7efb6b99\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.675011 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1431d7e-ecf6-4b69-891b-6522466fafb9-operator-scripts\") pod \"b1431d7e-ecf6-4b69-891b-6522466fafb9\" (UID: \"b1431d7e-ecf6-4b69-891b-6522466fafb9\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.675057 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whvxj\" (UniqueName: \"kubernetes.io/projected/b1431d7e-ecf6-4b69-891b-6522466fafb9-kube-api-access-whvxj\") pod \"b1431d7e-ecf6-4b69-891b-6522466fafb9\" (UID: \"b1431d7e-ecf6-4b69-891b-6522466fafb9\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.675093 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg5mz\" (UniqueName: \"kubernetes.io/projected/561f4d1a-2ac0-4446-a60d-922905025583-kube-api-access-tg5mz\") pod \"561f4d1a-2ac0-4446-a60d-922905025583\" (UID: \"561f4d1a-2ac0-4446-a60d-922905025583\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.675128 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19889054-44cb-47a4-a604-a319f1bd25af-etc-swift\") pod \"19889054-44cb-47a4-a604-a319f1bd25af\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.675157 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-scripts\") pod \"19889054-44cb-47a4-a604-a319f1bd25af\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.675190 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-9qsb6\" (UniqueName: \"kubernetes.io/projected/19889054-44cb-47a4-a604-a319f1bd25af-kube-api-access-9qsb6\") pod \"19889054-44cb-47a4-a604-a319f1bd25af\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.675233 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-dispersionconf\") pod \"19889054-44cb-47a4-a604-a319f1bd25af\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.675341 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-swiftconf\") pod \"19889054-44cb-47a4-a604-a319f1bd25af\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.675437 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-combined-ca-bundle\") pod \"19889054-44cb-47a4-a604-a319f1bd25af\" (UID: \"19889054-44cb-47a4-a604-a319f1bd25af\") " Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.675919 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5jjv\" (UniqueName: \"kubernetes.io/projected/766590ba-c5f6-426a-8562-bd3440bdbaa0-kube-api-access-k5jjv\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.675935 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/766590ba-c5f6-426a-8562-bd3440bdbaa0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.676887 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b1431d7e-ecf6-4b69-891b-6522466fafb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1431d7e-ecf6-4b69-891b-6522466fafb9" (UID: "b1431d7e-ecf6-4b69-891b-6522466fafb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.677007 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0df67c3-b98f-4294-9aa8-73ac7efb6b99" (UID: "a0df67c3-b98f-4294-9aa8-73ac7efb6b99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.677146 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "19889054-44cb-47a4-a604-a319f1bd25af" (UID: "19889054-44cb-47a4-a604-a319f1bd25af"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.677663 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561f4d1a-2ac0-4446-a60d-922905025583-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "561f4d1a-2ac0-4446-a60d-922905025583" (UID: "561f4d1a-2ac0-4446-a60d-922905025583"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.679828 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19889054-44cb-47a4-a604-a319f1bd25af-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "19889054-44cb-47a4-a604-a319f1bd25af" (UID: "19889054-44cb-47a4-a604-a319f1bd25af"). 
InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.680193 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1431d7e-ecf6-4b69-891b-6522466fafb9-kube-api-access-whvxj" (OuterVolumeSpecName: "kube-api-access-whvxj") pod "b1431d7e-ecf6-4b69-891b-6522466fafb9" (UID: "b1431d7e-ecf6-4b69-891b-6522466fafb9"). InnerVolumeSpecName "kube-api-access-whvxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.681121 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-kube-api-access-xfsc2" (OuterVolumeSpecName: "kube-api-access-xfsc2") pod "a0df67c3-b98f-4294-9aa8-73ac7efb6b99" (UID: "a0df67c3-b98f-4294-9aa8-73ac7efb6b99"). InnerVolumeSpecName "kube-api-access-xfsc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.682168 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561f4d1a-2ac0-4446-a60d-922905025583-kube-api-access-tg5mz" (OuterVolumeSpecName: "kube-api-access-tg5mz") pod "561f4d1a-2ac0-4446-a60d-922905025583" (UID: "561f4d1a-2ac0-4446-a60d-922905025583"). InnerVolumeSpecName "kube-api-access-tg5mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.686606 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19889054-44cb-47a4-a604-a319f1bd25af-kube-api-access-9qsb6" (OuterVolumeSpecName: "kube-api-access-9qsb6") pod "19889054-44cb-47a4-a604-a319f1bd25af" (UID: "19889054-44cb-47a4-a604-a319f1bd25af"). InnerVolumeSpecName "kube-api-access-9qsb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.702907 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-scripts" (OuterVolumeSpecName: "scripts") pod "19889054-44cb-47a4-a604-a319f1bd25af" (UID: "19889054-44cb-47a4-a604-a319f1bd25af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.719720 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19889054-44cb-47a4-a604-a319f1bd25af" (UID: "19889054-44cb-47a4-a604-a319f1bd25af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.722248 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "19889054-44cb-47a4-a604-a319f1bd25af" (UID: "19889054-44cb-47a4-a604-a319f1bd25af"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.735540 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "19889054-44cb-47a4-a604-a319f1bd25af" (UID: "19889054-44cb-47a4-a604-a319f1bd25af"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777150 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfsc2\" (UniqueName: \"kubernetes.io/projected/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-kube-api-access-xfsc2\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777182 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/561f4d1a-2ac0-4446-a60d-922905025583-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777192 4830 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777201 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0df67c3-b98f-4294-9aa8-73ac7efb6b99-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777209 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1431d7e-ecf6-4b69-891b-6522466fafb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777218 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whvxj\" (UniqueName: \"kubernetes.io/projected/b1431d7e-ecf6-4b69-891b-6522466fafb9-kube-api-access-whvxj\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777226 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg5mz\" (UniqueName: \"kubernetes.io/projected/561f4d1a-2ac0-4446-a60d-922905025583-kube-api-access-tg5mz\") on node \"crc\" DevicePath \"\"" Dec 03 
22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777235 4830 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19889054-44cb-47a4-a604-a319f1bd25af-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777243 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19889054-44cb-47a4-a604-a319f1bd25af-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777251 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qsb6\" (UniqueName: \"kubernetes.io/projected/19889054-44cb-47a4-a604-a319f1bd25af-kube-api-access-9qsb6\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777260 4830 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777269 4830 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.777277 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19889054-44cb-47a4-a604-a319f1bd25af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.952929 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wvw9q" event={"ID":"19889054-44cb-47a4-a604-a319f1bd25af","Type":"ContainerDied","Data":"4f34ace84aa1f4d638b70b9e709c846fa37ed296b66ff9b2d1af06a1d6ee3395"} Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.952982 4830 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="4f34ace84aa1f4d638b70b9e709c846fa37ed296b66ff9b2d1af06a1d6ee3395" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.953090 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wvw9q" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.955061 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9hhqz" event={"ID":"561f4d1a-2ac0-4446-a60d-922905025583","Type":"ContainerDied","Data":"5cb3c5e9fca7f079086c8d5998d50d731d3b80ab78898c53943ce7a355cdfbd0"} Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.955148 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cb3c5e9fca7f079086c8d5998d50d731d3b80ab78898c53943ce7a355cdfbd0" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.955229 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9hhqz" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.957351 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8d55-account-create-update-nqj5k" event={"ID":"a0df67c3-b98f-4294-9aa8-73ac7efb6b99","Type":"ContainerDied","Data":"de078c5adf80211cfc68ccdcd590961fe1239816d1ac430e1cfd9b7d0d43d80b"} Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.957436 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de078c5adf80211cfc68ccdcd590961fe1239816d1ac430e1cfd9b7d0d43d80b" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.957629 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8d55-account-create-update-nqj5k" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.959776 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gx7cl" event={"ID":"b1431d7e-ecf6-4b69-891b-6522466fafb9","Type":"ContainerDied","Data":"309fe05d8c874a61e3766d271069fa871f6efd3d65a01eca0df14ca6956893a2"} Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.959816 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="309fe05d8c874a61e3766d271069fa871f6efd3d65a01eca0df14ca6956893a2" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.959819 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gx7cl" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.961744 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-89cc-account-create-update-vgvp8" event={"ID":"766590ba-c5f6-426a-8562-bd3440bdbaa0","Type":"ContainerDied","Data":"87553e6049d4ba53ba6fcd3d3f803ec0d4cb8c86129f405c8c5d63863cae8837"} Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.961785 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87553e6049d4ba53ba6fcd3d3f803ec0d4cb8c86129f405c8c5d63863cae8837" Dec 03 22:25:47 crc kubenswrapper[4830]: I1203 22:25:47.961857 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-89cc-account-create-update-vgvp8" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.274711 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.388573 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run-ovn\") pod \"ead028dc-bc30-4f4f-929e-e10348f62f36\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.388691 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-additional-scripts\") pod \"ead028dc-bc30-4f4f-929e-e10348f62f36\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.388726 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-scripts\") pod \"ead028dc-bc30-4f4f-929e-e10348f62f36\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.388753 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45q2j\" (UniqueName: \"kubernetes.io/projected/ead028dc-bc30-4f4f-929e-e10348f62f36-kube-api-access-45q2j\") pod \"ead028dc-bc30-4f4f-929e-e10348f62f36\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.388830 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run\") pod \"ead028dc-bc30-4f4f-929e-e10348f62f36\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.388868 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-log-ovn\") pod \"ead028dc-bc30-4f4f-929e-e10348f62f36\" (UID: \"ead028dc-bc30-4f4f-929e-e10348f62f36\") " Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.389334 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ead028dc-bc30-4f4f-929e-e10348f62f36" (UID: "ead028dc-bc30-4f4f-929e-e10348f62f36"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.389370 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ead028dc-bc30-4f4f-929e-e10348f62f36" (UID: "ead028dc-bc30-4f4f-929e-e10348f62f36"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.390040 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ead028dc-bc30-4f4f-929e-e10348f62f36" (UID: "ead028dc-bc30-4f4f-929e-e10348f62f36"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.390689 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-scripts" (OuterVolumeSpecName: "scripts") pod "ead028dc-bc30-4f4f-929e-e10348f62f36" (UID: "ead028dc-bc30-4f4f-929e-e10348f62f36"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.395608 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run" (OuterVolumeSpecName: "var-run") pod "ead028dc-bc30-4f4f-929e-e10348f62f36" (UID: "ead028dc-bc30-4f4f-929e-e10348f62f36"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.397712 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead028dc-bc30-4f4f-929e-e10348f62f36-kube-api-access-45q2j" (OuterVolumeSpecName: "kube-api-access-45q2j") pod "ead028dc-bc30-4f4f-929e-e10348f62f36" (UID: "ead028dc-bc30-4f4f-929e-e10348f62f36"). InnerVolumeSpecName "kube-api-access-45q2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.490136 4830 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.490165 4830 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.490175 4830 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ead028dc-bc30-4f4f-929e-e10348f62f36-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.490183 4830 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 
03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.490202 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ead028dc-bc30-4f4f-929e-e10348f62f36-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.490209 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45q2j\" (UniqueName: \"kubernetes.io/projected/ead028dc-bc30-4f4f-929e-e10348f62f36-kube-api-access-45q2j\") on node \"crc\" DevicePath \"\"" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.970969 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nlxm7-config-v42v4" event={"ID":"ead028dc-bc30-4f4f-929e-e10348f62f36","Type":"ContainerDied","Data":"9b25172f4b7e729c2d8117ff3f6371a170b5dfe130edc2b9895580d0d6eadfef"} Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.971006 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b25172f4b7e729c2d8117ff3f6371a170b5dfe130edc2b9895580d0d6eadfef" Dec 03 22:25:48 crc kubenswrapper[4830]: I1203 22:25:48.971021 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nlxm7-config-v42v4" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.425615 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-75hs2"] Dec 03 22:25:49 crc kubenswrapper[4830]: E1203 22:25:49.425948 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0df67c3-b98f-4294-9aa8-73ac7efb6b99" containerName="mariadb-account-create-update" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.425961 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0df67c3-b98f-4294-9aa8-73ac7efb6b99" containerName="mariadb-account-create-update" Dec 03 22:25:49 crc kubenswrapper[4830]: E1203 22:25:49.425978 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19889054-44cb-47a4-a604-a319f1bd25af" containerName="swift-ring-rebalance" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.425984 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="19889054-44cb-47a4-a604-a319f1bd25af" containerName="swift-ring-rebalance" Dec 03 22:25:49 crc kubenswrapper[4830]: E1203 22:25:49.425996 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1431d7e-ecf6-4b69-891b-6522466fafb9" containerName="mariadb-database-create" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.426003 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1431d7e-ecf6-4b69-891b-6522466fafb9" containerName="mariadb-database-create" Dec 03 22:25:49 crc kubenswrapper[4830]: E1203 22:25:49.426012 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561f4d1a-2ac0-4446-a60d-922905025583" containerName="mariadb-database-create" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.426018 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="561f4d1a-2ac0-4446-a60d-922905025583" containerName="mariadb-database-create" Dec 03 22:25:49 crc kubenswrapper[4830]: E1203 22:25:49.426028 4830 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="766590ba-c5f6-426a-8562-bd3440bdbaa0" containerName="mariadb-account-create-update" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.426035 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="766590ba-c5f6-426a-8562-bd3440bdbaa0" containerName="mariadb-account-create-update" Dec 03 22:25:49 crc kubenswrapper[4830]: E1203 22:25:49.426052 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ead028dc-bc30-4f4f-929e-e10348f62f36" containerName="ovn-config" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.426058 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead028dc-bc30-4f4f-929e-e10348f62f36" containerName="ovn-config" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.426209 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="561f4d1a-2ac0-4446-a60d-922905025583" containerName="mariadb-database-create" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.426220 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="766590ba-c5f6-426a-8562-bd3440bdbaa0" containerName="mariadb-account-create-update" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.426232 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ead028dc-bc30-4f4f-929e-e10348f62f36" containerName="ovn-config" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.426239 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0df67c3-b98f-4294-9aa8-73ac7efb6b99" containerName="mariadb-account-create-update" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.426249 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="19889054-44cb-47a4-a604-a319f1bd25af" containerName="swift-ring-rebalance" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.426263 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1431d7e-ecf6-4b69-891b-6522466fafb9" containerName="mariadb-database-create" Dec 03 22:25:49 crc 
kubenswrapper[4830]: I1203 22:25:49.428188 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.430478 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.431370 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hc559" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.453912 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-75hs2"] Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.489218 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nlxm7-config-v42v4"] Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.495424 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nlxm7-config-v42v4"] Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.507374 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-config-data\") pod \"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.507649 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-db-sync-config-data\") pod \"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.507782 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvct2\" (UniqueName: 
\"kubernetes.io/projected/b445f62c-c9c8-488f-ad3c-4fd162cb1092-kube-api-access-zvct2\") pod \"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.507887 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-combined-ca-bundle\") pod \"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.609244 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-config-data\") pod \"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.609548 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-db-sync-config-data\") pod \"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.609657 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvct2\" (UniqueName: \"kubernetes.io/projected/b445f62c-c9c8-488f-ad3c-4fd162cb1092-kube-api-access-zvct2\") pod \"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.609770 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-combined-ca-bundle\") pod 
\"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.623324 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-config-data\") pod \"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.623451 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-combined-ca-bundle\") pod \"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.630557 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-db-sync-config-data\") pod \"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.634470 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvct2\" (UniqueName: \"kubernetes.io/projected/b445f62c-c9c8-488f-ad3c-4fd162cb1092-kube-api-access-zvct2\") pod \"glance-db-sync-75hs2\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.748589 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-75hs2" Dec 03 22:25:49 crc kubenswrapper[4830]: I1203 22:25:49.794072 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nlxm7" Dec 03 22:25:51 crc kubenswrapper[4830]: I1203 22:25:50.392467 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-75hs2"] Dec 03 22:25:51 crc kubenswrapper[4830]: I1203 22:25:50.990158 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-75hs2" event={"ID":"b445f62c-c9c8-488f-ad3c-4fd162cb1092","Type":"ContainerStarted","Data":"8a2590b8e9824347c7a007f48cc03a7c09dcc896105ddaeff6d7fbfb4159b6cc"} Dec 03 22:25:51 crc kubenswrapper[4830]: I1203 22:25:51.355096 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ead028dc-bc30-4f4f-929e-e10348f62f36" path="/var/lib/kubelet/pods/ead028dc-bc30-4f4f-929e-e10348f62f36/volumes" Dec 03 22:25:52 crc kubenswrapper[4830]: I1203 22:25:52.001742 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 22:25:52 crc kubenswrapper[4830]: I1203 22:25:52.004366 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 22:25:53 crc kubenswrapper[4830]: I1203 22:25:53.012058 4830 generic.go:334] "Generic (PLEG): container finished" podID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" containerID="b770f5ff20096e432c2f76473a1a5de1ea08c80b6cf3d629c1285e3a33deff49" exitCode=0 Dec 03 22:25:53 crc kubenswrapper[4830]: I1203 22:25:53.012430 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fc13f96-b9cf-4e92-bbe6-2c3719041e59","Type":"ContainerDied","Data":"b770f5ff20096e432c2f76473a1a5de1ea08c80b6cf3d629c1285e3a33deff49"} Dec 03 22:25:53 crc kubenswrapper[4830]: I1203 22:25:53.015997 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" containerID="fea7ef4434389728b9f81d8aa6ccd5a9ac43c0d7acae08c6c3292151a1167cff" exitCode=0 Dec 03 22:25:53 crc kubenswrapper[4830]: I1203 22:25:53.016120 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1","Type":"ContainerDied","Data":"fea7ef4434389728b9f81d8aa6ccd5a9ac43c0d7acae08c6c3292151a1167cff"} Dec 03 22:25:53 crc kubenswrapper[4830]: I1203 22:25:53.018380 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 22:25:54 crc kubenswrapper[4830]: I1203 22:25:54.027160 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1","Type":"ContainerStarted","Data":"f1c3050649d45ba05f8c7e16d94979bee4c4861b21f455826e9557e3bcc7ac7e"} Dec 03 22:25:54 crc kubenswrapper[4830]: I1203 22:25:54.027793 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 22:25:54 crc kubenswrapper[4830]: I1203 22:25:54.031579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fc13f96-b9cf-4e92-bbe6-2c3719041e59","Type":"ContainerStarted","Data":"accd5ab84c124d6c490627992a78ee18f80188c66c763bb70f6a9af31dbbe222"} Dec 03 22:25:54 crc kubenswrapper[4830]: I1203 22:25:54.031770 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:25:54 crc kubenswrapper[4830]: I1203 22:25:54.059738 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=41.258465863 podStartE2EDuration="1m25.059720626s" podCreationTimestamp="2025-12-03 22:24:29 +0000 UTC" firstStartedPulling="2025-12-03 22:24:32.208938164 +0000 UTC m=+1161.205399503" lastFinishedPulling="2025-12-03 22:25:16.010192907 +0000 
UTC m=+1205.006654266" observedRunningTime="2025-12-03 22:25:54.052450779 +0000 UTC m=+1243.048912128" watchObservedRunningTime="2025-12-03 22:25:54.059720626 +0000 UTC m=+1243.056181975" Dec 03 22:25:54 crc kubenswrapper[4830]: I1203 22:25:54.073636 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.660419312 podStartE2EDuration="1m26.073620204s" podCreationTimestamp="2025-12-03 22:24:28 +0000 UTC" firstStartedPulling="2025-12-03 22:24:30.598013073 +0000 UTC m=+1159.594474422" lastFinishedPulling="2025-12-03 22:25:16.011213965 +0000 UTC m=+1205.007675314" observedRunningTime="2025-12-03 22:25:54.073327366 +0000 UTC m=+1243.069788715" watchObservedRunningTime="2025-12-03 22:25:54.073620204 +0000 UTC m=+1243.070081553" Dec 03 22:25:55 crc kubenswrapper[4830]: I1203 22:25:55.568945 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:25:55 crc kubenswrapper[4830]: I1203 22:25:55.569566 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="prometheus" containerID="cri-o://19b361512fa4c6f017b9fc0c350f7985f647adac000fa04cc66280d88048adef" gracePeriod=600 Dec 03 22:25:55 crc kubenswrapper[4830]: I1203 22:25:55.569598 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="config-reloader" containerID="cri-o://27139c88accd5c1abc4ea51a562d002e0f7bae4858086f81eba8a195adb9c643" gracePeriod=600 Dec 03 22:25:55 crc kubenswrapper[4830]: I1203 22:25:55.569626 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="thanos-sidecar" 
containerID="cri-o://cdcda774bd14de42e9e955e8b645bc4c809aadc6cbf62bc94d0018527ec5a70e" gracePeriod=600 Dec 03 22:25:55 crc kubenswrapper[4830]: I1203 22:25:55.741453 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="09564097-60ae-4b1d-bd03-ba8b5a254167" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 22:25:56 crc kubenswrapper[4830]: I1203 22:25:56.681284 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:25:56 crc kubenswrapper[4830]: I1203 22:25:56.681608 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:25:56 crc kubenswrapper[4830]: I1203 22:25:56.681658 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:25:56 crc kubenswrapper[4830]: I1203 22:25:56.682426 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6699c360d5a693c67050c7e96ce75da1159981ce57e7b4db9870c6676af8453"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 22:25:56 crc kubenswrapper[4830]: I1203 22:25:56.682482 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://b6699c360d5a693c67050c7e96ce75da1159981ce57e7b4db9870c6676af8453" gracePeriod=600 Dec 03 22:25:57 crc kubenswrapper[4830]: I1203 22:25:57.063861 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="b6699c360d5a693c67050c7e96ce75da1159981ce57e7b4db9870c6676af8453" exitCode=0 Dec 03 22:25:57 crc kubenswrapper[4830]: I1203 22:25:57.063948 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"b6699c360d5a693c67050c7e96ce75da1159981ce57e7b4db9870c6676af8453"} Dec 03 22:25:57 crc kubenswrapper[4830]: I1203 22:25:57.064187 4830 scope.go:117] "RemoveContainer" containerID="942bb799e68a31057e858be496e721bb353443055eb7deda485ab32976586b59" Dec 03 22:25:57 crc kubenswrapper[4830]: I1203 22:25:57.068659 4830 generic.go:334] "Generic (PLEG): container finished" podID="e38be206-c963-42f9-834d-a9263b18cbed" containerID="cdcda774bd14de42e9e955e8b645bc4c809aadc6cbf62bc94d0018527ec5a70e" exitCode=0 Dec 03 22:25:57 crc kubenswrapper[4830]: I1203 22:25:57.068688 4830 generic.go:334] "Generic (PLEG): container finished" podID="e38be206-c963-42f9-834d-a9263b18cbed" containerID="27139c88accd5c1abc4ea51a562d002e0f7bae4858086f81eba8a195adb9c643" exitCode=0 Dec 03 22:25:57 crc kubenswrapper[4830]: I1203 22:25:57.068697 4830 generic.go:334] "Generic (PLEG): container finished" podID="e38be206-c963-42f9-834d-a9263b18cbed" containerID="19b361512fa4c6f017b9fc0c350f7985f647adac000fa04cc66280d88048adef" exitCode=0 Dec 03 22:25:57 crc kubenswrapper[4830]: I1203 22:25:57.068735 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"e38be206-c963-42f9-834d-a9263b18cbed","Type":"ContainerDied","Data":"cdcda774bd14de42e9e955e8b645bc4c809aadc6cbf62bc94d0018527ec5a70e"} Dec 03 22:25:57 crc kubenswrapper[4830]: I1203 22:25:57.068784 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e38be206-c963-42f9-834d-a9263b18cbed","Type":"ContainerDied","Data":"27139c88accd5c1abc4ea51a562d002e0f7bae4858086f81eba8a195adb9c643"} Dec 03 22:25:57 crc kubenswrapper[4830]: I1203 22:25:57.068795 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e38be206-c963-42f9-834d-a9263b18cbed","Type":"ContainerDied","Data":"19b361512fa4c6f017b9fc0c350f7985f647adac000fa04cc66280d88048adef"} Dec 03 22:25:58 crc kubenswrapper[4830]: I1203 22:25:58.891422 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:58 crc kubenswrapper[4830]: I1203 22:25:58.902679 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e-etc-swift\") pod \"swift-storage-0\" (UID: \"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e\") " pod="openstack/swift-storage-0" Dec 03 22:25:59 crc kubenswrapper[4830]: I1203 22:25:59.085865 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 22:26:00 crc kubenswrapper[4830]: I1203 22:26:00.004394 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.114:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.178467 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e38be206-c963-42f9-834d-a9263b18cbed","Type":"ContainerDied","Data":"822cbefceb53664e8ab733416702583587f7c55bc6bf713917d272c9e75ad7c9"} Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.179866 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="822cbefceb53664e8ab733416702583587f7c55bc6bf713917d272c9e75ad7c9" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.188536 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.344202 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-web-config\") pod \"e38be206-c963-42f9-834d-a9263b18cbed\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.344620 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7vt2\" (UniqueName: \"kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-kube-api-access-h7vt2\") pod \"e38be206-c963-42f9-834d-a9263b18cbed\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.344665 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-thanos-prometheus-http-client-file\") pod \"e38be206-c963-42f9-834d-a9263b18cbed\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.344693 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-config\") pod \"e38be206-c963-42f9-834d-a9263b18cbed\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.344724 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-tls-assets\") pod \"e38be206-c963-42f9-834d-a9263b18cbed\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.344979 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\") pod \"e38be206-c963-42f9-834d-a9263b18cbed\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.345069 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e38be206-c963-42f9-834d-a9263b18cbed-prometheus-metric-storage-rulefiles-0\") pod \"e38be206-c963-42f9-834d-a9263b18cbed\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.345138 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e38be206-c963-42f9-834d-a9263b18cbed-config-out\") pod \"e38be206-c963-42f9-834d-a9263b18cbed\" (UID: \"e38be206-c963-42f9-834d-a9263b18cbed\") " Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.349136 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e38be206-c963-42f9-834d-a9263b18cbed-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e38be206-c963-42f9-834d-a9263b18cbed" (UID: "e38be206-c963-42f9-834d-a9263b18cbed"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.351182 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e38be206-c963-42f9-834d-a9263b18cbed-config-out" (OuterVolumeSpecName: "config-out") pod "e38be206-c963-42f9-834d-a9263b18cbed" (UID: "e38be206-c963-42f9-834d-a9263b18cbed"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.354304 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-config" (OuterVolumeSpecName: "config") pod "e38be206-c963-42f9-834d-a9263b18cbed" (UID: "e38be206-c963-42f9-834d-a9263b18cbed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.359958 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e38be206-c963-42f9-834d-a9263b18cbed" (UID: "e38be206-c963-42f9-834d-a9263b18cbed"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.371251 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e38be206-c963-42f9-834d-a9263b18cbed" (UID: "e38be206-c963-42f9-834d-a9263b18cbed"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.372399 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-kube-api-access-h7vt2" (OuterVolumeSpecName: "kube-api-access-h7vt2") pod "e38be206-c963-42f9-834d-a9263b18cbed" (UID: "e38be206-c963-42f9-834d-a9263b18cbed"). InnerVolumeSpecName "kube-api-access-h7vt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.424664 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e38be206-c963-42f9-834d-a9263b18cbed" (UID: "e38be206-c963-42f9-834d-a9263b18cbed"). InnerVolumeSpecName "pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.428007 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-web-config" (OuterVolumeSpecName: "web-config") pod "e38be206-c963-42f9-834d-a9263b18cbed" (UID: "e38be206-c963-42f9-834d-a9263b18cbed"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.448785 4830 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-web-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.448961 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7vt2\" (UniqueName: \"kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-kube-api-access-h7vt2\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.449037 4830 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.449133 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/e38be206-c963-42f9-834d-a9263b18cbed-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.449206 4830 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e38be206-c963-42f9-834d-a9263b18cbed-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.449302 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\") on node \"crc\" " Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.449379 4830 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e38be206-c963-42f9-834d-a9263b18cbed-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.449433 4830 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e38be206-c963-42f9-834d-a9263b18cbed-config-out\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.478097 4830 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.478252 4830 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e") on node "crc" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.551754 4830 reconciler_common.go:293] "Volume detached for volume \"pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:04 crc kubenswrapper[4830]: I1203 22:26:04.772385 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.002792 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.114:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.192116 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"3bd6fabec591d425f41afb38019834811f4544d26862fd24d42036a1c6e0182d"} Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.195726 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"d26c80f781595ab631a18d1aaee4115dc9be59de9280be665b2937d58ec45743"} Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.198230 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.203202 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-75hs2" event={"ID":"b445f62c-c9c8-488f-ad3c-4fd162cb1092","Type":"ContainerStarted","Data":"dfa4d1d9a173f4260396b1f874bc8f68e8891bf73bc1bc8071a97e792dab79ad"} Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.257941 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-75hs2" podStartSLOduration=2.45293683 podStartE2EDuration="16.257922796s" podCreationTimestamp="2025-12-03 22:25:49 +0000 UTC" firstStartedPulling="2025-12-03 22:25:50.416763274 +0000 UTC m=+1239.413224623" lastFinishedPulling="2025-12-03 22:26:04.22174924 +0000 UTC m=+1253.218210589" observedRunningTime="2025-12-03 22:26:05.247432201 +0000 UTC m=+1254.243893590" watchObservedRunningTime="2025-12-03 22:26:05.257922796 +0000 UTC m=+1254.254384145" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.286673 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.308645 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.327710 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:26:05 crc kubenswrapper[4830]: E1203 22:26:05.328160 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="config-reloader" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.328184 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="config-reloader" Dec 03 22:26:05 crc kubenswrapper[4830]: E1203 22:26:05.328198 4830 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="thanos-sidecar" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.328219 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="thanos-sidecar" Dec 03 22:26:05 crc kubenswrapper[4830]: E1203 22:26:05.328243 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="init-config-reloader" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.328253 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="init-config-reloader" Dec 03 22:26:05 crc kubenswrapper[4830]: E1203 22:26:05.328274 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="prometheus" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.328282 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="prometheus" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.328498 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="prometheus" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.328539 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="config-reloader" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.328556 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e38be206-c963-42f9-834d-a9263b18cbed" containerName="thanos-sidecar" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.330594 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.339568 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.339746 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.340098 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.340169 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.340192 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.340601 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kmlv2" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.354100 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e38be206-c963-42f9-834d-a9263b18cbed" path="/var/lib/kubelet/pods/e38be206-c963-42f9-834d-a9263b18cbed/volumes" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.354920 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.356091 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.467527 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.467591 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/86b4b71c-3ed3-4413-a90b-4523b4a5c549-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.467608 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.467642 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/86b4b71c-3ed3-4413-a90b-4523b4a5c549-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.467663 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod 
\"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.467688 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.467723 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.467740 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-config\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.467762 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lghf\" (UniqueName: \"kubernetes.io/projected/86b4b71c-3ed3-4413-a90b-4523b4a5c549-kube-api-access-5lghf\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.467779 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/86b4b71c-3ed3-4413-a90b-4523b4a5c549-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.467803 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.569143 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.569199 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.569224 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-config\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.569250 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lghf\" 
(UniqueName: \"kubernetes.io/projected/86b4b71c-3ed3-4413-a90b-4523b4a5c549-kube-api-access-5lghf\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.569270 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/86b4b71c-3ed3-4413-a90b-4523b4a5c549-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.569294 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.569354 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.569392 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.569408 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/86b4b71c-3ed3-4413-a90b-4523b4a5c549-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.569435 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/86b4b71c-3ed3-4413-a90b-4523b4a5c549-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.569455 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.571940 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/86b4b71c-3ed3-4413-a90b-4523b4a5c549-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.573642 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.573882 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/76467b962fce3d483edcc8f8bee5aee42e3f124190cf9bb015fb947a94c7c8bb/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.575894 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.576375 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.576888 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc 
kubenswrapper[4830]: I1203 22:26:05.578144 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/86b4b71c-3ed3-4413-a90b-4523b4a5c549-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.578618 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/86b4b71c-3ed3-4413-a90b-4523b4a5c549-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.578980 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-config\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.580859 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.598231 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/86b4b71c-3ed3-4413-a90b-4523b4a5c549-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.598587 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5lghf\" (UniqueName: \"kubernetes.io/projected/86b4b71c-3ed3-4413-a90b-4523b4a5c549-kube-api-access-5lghf\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.618978 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a752c9aa-2539-40d9-a1dc-cb96a004dc9e\") pod \"prometheus-metric-storage-0\" (UID: \"86b4b71c-3ed3-4413-a90b-4523b4a5c549\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.678310 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:05 crc kubenswrapper[4830]: I1203 22:26:05.741537 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="09564097-60ae-4b1d-bd03-ba8b5a254167" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 22:26:06 crc kubenswrapper[4830]: I1203 22:26:06.425047 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:26:06 crc kubenswrapper[4830]: W1203 22:26:06.442159 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86b4b71c_3ed3_4413_a90b_4523b4a5c549.slice/crio-9a11d070c7e7535707a9ad845e868a737bb98516842b774d1a6fd98d0728888a WatchSource:0}: Error finding container 9a11d070c7e7535707a9ad845e868a737bb98516842b774d1a6fd98d0728888a: Status 404 returned error can't find the container with id 9a11d070c7e7535707a9ad845e868a737bb98516842b774d1a6fd98d0728888a Dec 03 22:26:07 crc kubenswrapper[4830]: I1203 22:26:07.225256 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"b9578fae2c7d7cade25652796739e39046e6807566414b0bb85a1d55ce06b34a"} Dec 03 22:26:07 crc kubenswrapper[4830]: I1203 22:26:07.225745 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"62619f0369014452ca534955721f55b5be7adf48ed7ef3ab99212b8de08e6ca6"} Dec 03 22:26:07 crc kubenswrapper[4830]: I1203 22:26:07.227830 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"86b4b71c-3ed3-4413-a90b-4523b4a5c549","Type":"ContainerStarted","Data":"9a11d070c7e7535707a9ad845e868a737bb98516842b774d1a6fd98d0728888a"} Dec 03 22:26:08 crc kubenswrapper[4830]: I1203 22:26:08.238532 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"076c01802c9ab53d5e262e07342efd68f7f73eb8cb0cc5d57ef6bf54343f9546"} Dec 03 22:26:08 crc kubenswrapper[4830]: I1203 22:26:08.238879 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"d3c5af0d4f7e7f7f156e3148b50e19cca4a7f22cb62e7dcd2272bb01eafbd45e"} Dec 03 22:26:10 crc kubenswrapper[4830]: I1203 22:26:10.264608 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"61b4303f2dc6c6ba0e993987cc7abd24e41aa841d1f96e844d5882ee665b8f39"} Dec 03 22:26:10 crc kubenswrapper[4830]: I1203 22:26:10.265165 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"7b39d319857e524508e7b2cd9cfe6ef1b4823241dd6124b4f3a4fb3d3171539f"} Dec 03 22:26:10 crc kubenswrapper[4830]: I1203 22:26:10.265176 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"a9b54566afcbfc1ea0aef2eab0aa108e8701482fc5c4af3749b2f4c84ed9f55f"} Dec 03 22:26:10 crc kubenswrapper[4830]: I1203 22:26:10.265185 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"831bc2d9133d819034018b1c5c1c23d8eb0e6a26385f649abc639235e9cf0624"} Dec 03 22:26:10 crc kubenswrapper[4830]: I1203 22:26:10.267747 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"86b4b71c-3ed3-4413-a90b-4523b4a5c549","Type":"ContainerStarted","Data":"11c152810e3c9c9f455e71a9a36ff1ed2fb6cd873d9093b71253631afff9ef4c"} Dec 03 22:26:10 crc kubenswrapper[4830]: I1203 22:26:10.311744 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:26:11 crc kubenswrapper[4830]: I1203 22:26:11.662708 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.199183 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-w9fdb"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.200664 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-w9fdb" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.209450 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-w9fdb"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.292167 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-pkmrj"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.293304 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pkmrj" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.301747 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7bjj\" (UniqueName: \"kubernetes.io/projected/5ba568a6-43aa-4368-a33f-50b182c1faf8-kube-api-access-t7bjj\") pod \"cinder-db-create-w9fdb\" (UID: \"5ba568a6-43aa-4368-a33f-50b182c1faf8\") " pod="openstack/cinder-db-create-w9fdb" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.301798 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ba568a6-43aa-4368-a33f-50b182c1faf8-operator-scripts\") pod \"cinder-db-create-w9fdb\" (UID: \"5ba568a6-43aa-4368-a33f-50b182c1faf8\") " pod="openstack/cinder-db-create-w9fdb" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.302394 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pkmrj"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.328324 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"ccc6c1429a30ac09eeb695a5f092129b021f8873b258ba47e60bf28e499a65bf"} Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.328364 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"36a75fa6c0155bad5d57a29159df7a9cc5c040e6904a20f60edd0449fdcdc54e"} Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.328379 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"edad45a9a987d57896b5ef3cf99778f1ab86744dad3e4f635164157a888ad927"} Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.393875 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3133-account-create-update-2jnjh"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.395329 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3133-account-create-update-2jnjh" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.400261 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.403342 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7bjj\" (UniqueName: \"kubernetes.io/projected/5ba568a6-43aa-4368-a33f-50b182c1faf8-kube-api-access-t7bjj\") pod \"cinder-db-create-w9fdb\" (UID: \"5ba568a6-43aa-4368-a33f-50b182c1faf8\") " pod="openstack/cinder-db-create-w9fdb" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.403415 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ba568a6-43aa-4368-a33f-50b182c1faf8-operator-scripts\") pod \"cinder-db-create-w9fdb\" (UID: \"5ba568a6-43aa-4368-a33f-50b182c1faf8\") " pod="openstack/cinder-db-create-w9fdb" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.403468 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/600b2e52-def7-4084-982d-5ccef14d35fd-operator-scripts\") pod \"barbican-db-create-pkmrj\" (UID: \"600b2e52-def7-4084-982d-5ccef14d35fd\") " pod="openstack/barbican-db-create-pkmrj" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.403492 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxszk\" (UniqueName: \"kubernetes.io/projected/600b2e52-def7-4084-982d-5ccef14d35fd-kube-api-access-jxszk\") pod \"barbican-db-create-pkmrj\" (UID: \"600b2e52-def7-4084-982d-5ccef14d35fd\") " pod="openstack/barbican-db-create-pkmrj" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.405415 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ba568a6-43aa-4368-a33f-50b182c1faf8-operator-scripts\") pod \"cinder-db-create-w9fdb\" (UID: \"5ba568a6-43aa-4368-a33f-50b182c1faf8\") " pod="openstack/cinder-db-create-w9fdb" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.407881 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3133-account-create-update-2jnjh"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.445424 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7bjj\" (UniqueName: \"kubernetes.io/projected/5ba568a6-43aa-4368-a33f-50b182c1faf8-kube-api-access-t7bjj\") pod \"cinder-db-create-w9fdb\" (UID: \"5ba568a6-43aa-4368-a33f-50b182c1faf8\") " pod="openstack/cinder-db-create-w9fdb" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.491038 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-qrsbk"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.492547 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-qrsbk" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.497423 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-58de-account-create-update-nxbc7"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.498517 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-58de-account-create-update-nxbc7" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.500886 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.504548 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jmt\" (UniqueName: \"kubernetes.io/projected/698c1d5e-41e8-4701-b3d3-81b015482ff1-kube-api-access-92jmt\") pod \"cinder-3133-account-create-update-2jnjh\" (UID: \"698c1d5e-41e8-4701-b3d3-81b015482ff1\") " pod="openstack/cinder-3133-account-create-update-2jnjh" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.504604 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698c1d5e-41e8-4701-b3d3-81b015482ff1-operator-scripts\") pod \"cinder-3133-account-create-update-2jnjh\" (UID: \"698c1d5e-41e8-4701-b3d3-81b015482ff1\") " pod="openstack/cinder-3133-account-create-update-2jnjh" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.504926 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/600b2e52-def7-4084-982d-5ccef14d35fd-operator-scripts\") pod \"barbican-db-create-pkmrj\" (UID: \"600b2e52-def7-4084-982d-5ccef14d35fd\") " pod="openstack/barbican-db-create-pkmrj" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.504989 4830 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jxszk\" (UniqueName: \"kubernetes.io/projected/600b2e52-def7-4084-982d-5ccef14d35fd-kube-api-access-jxszk\") pod \"barbican-db-create-pkmrj\" (UID: \"600b2e52-def7-4084-982d-5ccef14d35fd\") " pod="openstack/barbican-db-create-pkmrj" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.506147 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/600b2e52-def7-4084-982d-5ccef14d35fd-operator-scripts\") pod \"barbican-db-create-pkmrj\" (UID: \"600b2e52-def7-4084-982d-5ccef14d35fd\") " pod="openstack/barbican-db-create-pkmrj" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.518656 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-qrsbk"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.524519 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxszk\" (UniqueName: \"kubernetes.io/projected/600b2e52-def7-4084-982d-5ccef14d35fd-kube-api-access-jxszk\") pod \"barbican-db-create-pkmrj\" (UID: \"600b2e52-def7-4084-982d-5ccef14d35fd\") " pod="openstack/barbican-db-create-pkmrj" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.536667 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-58de-account-create-update-nxbc7"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.572112 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-w9fdb" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.601536 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-6j6q9"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.602736 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-6j6q9" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.607268 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92jmt\" (UniqueName: \"kubernetes.io/projected/698c1d5e-41e8-4701-b3d3-81b015482ff1-kube-api-access-92jmt\") pod \"cinder-3133-account-create-update-2jnjh\" (UID: \"698c1d5e-41e8-4701-b3d3-81b015482ff1\") " pod="openstack/cinder-3133-account-create-update-2jnjh" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.607333 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698c1d5e-41e8-4701-b3d3-81b015482ff1-operator-scripts\") pod \"cinder-3133-account-create-update-2jnjh\" (UID: \"698c1d5e-41e8-4701-b3d3-81b015482ff1\") " pod="openstack/cinder-3133-account-create-update-2jnjh" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.607357 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4m5m\" (UniqueName: \"kubernetes.io/projected/ece3f713-2d71-4b1f-a20c-504f9e2dec24-kube-api-access-c4m5m\") pod \"barbican-58de-account-create-update-nxbc7\" (UID: \"ece3f713-2d71-4b1f-a20c-504f9e2dec24\") " pod="openstack/barbican-58de-account-create-update-nxbc7" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.607379 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece3f713-2d71-4b1f-a20c-504f9e2dec24-operator-scripts\") pod \"barbican-58de-account-create-update-nxbc7\" (UID: \"ece3f713-2d71-4b1f-a20c-504f9e2dec24\") " pod="openstack/barbican-58de-account-create-update-nxbc7" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.607444 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2fd0383c-30e5-4787-9d20-fe1c5de32b62-operator-scripts\") pod \"cloudkitty-db-create-qrsbk\" (UID: \"2fd0383c-30e5-4787-9d20-fe1c5de32b62\") " pod="openstack/cloudkitty-db-create-qrsbk" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.607500 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltspv\" (UniqueName: \"kubernetes.io/projected/2fd0383c-30e5-4787-9d20-fe1c5de32b62-kube-api-access-ltspv\") pod \"cloudkitty-db-create-qrsbk\" (UID: \"2fd0383c-30e5-4787-9d20-fe1c5de32b62\") " pod="openstack/cloudkitty-db-create-qrsbk" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.608431 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698c1d5e-41e8-4701-b3d3-81b015482ff1-operator-scripts\") pod \"cinder-3133-account-create-update-2jnjh\" (UID: \"698c1d5e-41e8-4701-b3d3-81b015482ff1\") " pod="openstack/cinder-3133-account-create-update-2jnjh" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.609192 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6cfjx" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.609454 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.609569 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.609668 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.618965 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-pkmrj" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.639108 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6j6q9"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.651759 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jmt\" (UniqueName: \"kubernetes.io/projected/698c1d5e-41e8-4701-b3d3-81b015482ff1-kube-api-access-92jmt\") pod \"cinder-3133-account-create-update-2jnjh\" (UID: \"698c1d5e-41e8-4701-b3d3-81b015482ff1\") " pod="openstack/cinder-3133-account-create-update-2jnjh" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.673627 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-xdgf6"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.674847 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xdgf6" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.703713 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xdgf6"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.709498 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-config-data\") pod \"keystone-db-sync-6j6q9\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") " pod="openstack/keystone-db-sync-6j6q9" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.709585 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fd0383c-30e5-4787-9d20-fe1c5de32b62-operator-scripts\") pod \"cloudkitty-db-create-qrsbk\" (UID: \"2fd0383c-30e5-4787-9d20-fe1c5de32b62\") " pod="openstack/cloudkitty-db-create-qrsbk" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.709646 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-combined-ca-bundle\") pod \"keystone-db-sync-6j6q9\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") " pod="openstack/keystone-db-sync-6j6q9" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.709685 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltspv\" (UniqueName: \"kubernetes.io/projected/2fd0383c-30e5-4787-9d20-fe1c5de32b62-kube-api-access-ltspv\") pod \"cloudkitty-db-create-qrsbk\" (UID: \"2fd0383c-30e5-4787-9d20-fe1c5de32b62\") " pod="openstack/cloudkitty-db-create-qrsbk" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.709706 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zds78\" (UniqueName: \"kubernetes.io/projected/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-kube-api-access-zds78\") pod \"keystone-db-sync-6j6q9\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") " pod="openstack/keystone-db-sync-6j6q9" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.709768 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4m5m\" (UniqueName: \"kubernetes.io/projected/ece3f713-2d71-4b1f-a20c-504f9e2dec24-kube-api-access-c4m5m\") pod \"barbican-58de-account-create-update-nxbc7\" (UID: \"ece3f713-2d71-4b1f-a20c-504f9e2dec24\") " pod="openstack/barbican-58de-account-create-update-nxbc7" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.709799 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece3f713-2d71-4b1f-a20c-504f9e2dec24-operator-scripts\") pod \"barbican-58de-account-create-update-nxbc7\" (UID: \"ece3f713-2d71-4b1f-a20c-504f9e2dec24\") " pod="openstack/barbican-58de-account-create-update-nxbc7" Dec 03 
22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.710485 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece3f713-2d71-4b1f-a20c-504f9e2dec24-operator-scripts\") pod \"barbican-58de-account-create-update-nxbc7\" (UID: \"ece3f713-2d71-4b1f-a20c-504f9e2dec24\") " pod="openstack/barbican-58de-account-create-update-nxbc7" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.720023 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fd0383c-30e5-4787-9d20-fe1c5de32b62-operator-scripts\") pod \"cloudkitty-db-create-qrsbk\" (UID: \"2fd0383c-30e5-4787-9d20-fe1c5de32b62\") " pod="openstack/cloudkitty-db-create-qrsbk" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.727124 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3133-account-create-update-2jnjh" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.748058 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4m5m\" (UniqueName: \"kubernetes.io/projected/ece3f713-2d71-4b1f-a20c-504f9e2dec24-kube-api-access-c4m5m\") pod \"barbican-58de-account-create-update-nxbc7\" (UID: \"ece3f713-2d71-4b1f-a20c-504f9e2dec24\") " pod="openstack/barbican-58de-account-create-update-nxbc7" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.763905 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltspv\" (UniqueName: \"kubernetes.io/projected/2fd0383c-30e5-4787-9d20-fe1c5de32b62-kube-api-access-ltspv\") pod \"cloudkitty-db-create-qrsbk\" (UID: \"2fd0383c-30e5-4787-9d20-fe1c5de32b62\") " pod="openstack/cloudkitty-db-create-qrsbk" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.813752 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zds78\" (UniqueName: 
\"kubernetes.io/projected/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-kube-api-access-zds78\") pod \"keystone-db-sync-6j6q9\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") " pod="openstack/keystone-db-sync-6j6q9" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.813827 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15003660-6e4d-427f-8386-f4be36fc25f8-operator-scripts\") pod \"neutron-db-create-xdgf6\" (UID: \"15003660-6e4d-427f-8386-f4be36fc25f8\") " pod="openstack/neutron-db-create-xdgf6" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.813895 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-config-data\") pod \"keystone-db-sync-6j6q9\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") " pod="openstack/keystone-db-sync-6j6q9" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.813973 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9pgz\" (UniqueName: \"kubernetes.io/projected/15003660-6e4d-427f-8386-f4be36fc25f8-kube-api-access-b9pgz\") pod \"neutron-db-create-xdgf6\" (UID: \"15003660-6e4d-427f-8386-f4be36fc25f8\") " pod="openstack/neutron-db-create-xdgf6" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.814003 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-combined-ca-bundle\") pod \"keystone-db-sync-6j6q9\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") " pod="openstack/keystone-db-sync-6j6q9" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.822249 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-config-data\") pod \"keystone-db-sync-6j6q9\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") " pod="openstack/keystone-db-sync-6j6q9" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.832173 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-qrsbk" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.835689 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3f6b-account-create-update-sbkkw"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.836855 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3f6b-account-create-update-sbkkw" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.843529 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3f6b-account-create-update-sbkkw"] Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.861282 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.867136 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-58de-account-create-update-nxbc7" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.869214 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-combined-ca-bundle\") pod \"keystone-db-sync-6j6q9\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") " pod="openstack/keystone-db-sync-6j6q9" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.897950 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zds78\" (UniqueName: \"kubernetes.io/projected/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-kube-api-access-zds78\") pod \"keystone-db-sync-6j6q9\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") " pod="openstack/keystone-db-sync-6j6q9" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.920669 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c30a279-b67f-46c3-980e-e38b9cc27eb9-operator-scripts\") pod \"neutron-3f6b-account-create-update-sbkkw\" (UID: \"5c30a279-b67f-46c3-980e-e38b9cc27eb9\") " pod="openstack/neutron-3f6b-account-create-update-sbkkw" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.920721 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9pgz\" (UniqueName: \"kubernetes.io/projected/15003660-6e4d-427f-8386-f4be36fc25f8-kube-api-access-b9pgz\") pod \"neutron-db-create-xdgf6\" (UID: \"15003660-6e4d-427f-8386-f4be36fc25f8\") " pod="openstack/neutron-db-create-xdgf6" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.920991 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15003660-6e4d-427f-8386-f4be36fc25f8-operator-scripts\") pod \"neutron-db-create-xdgf6\" (UID: 
\"15003660-6e4d-427f-8386-f4be36fc25f8\") " pod="openstack/neutron-db-create-xdgf6" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.921291 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfxb\" (UniqueName: \"kubernetes.io/projected/5c30a279-b67f-46c3-980e-e38b9cc27eb9-kube-api-access-nwfxb\") pod \"neutron-3f6b-account-create-update-sbkkw\" (UID: \"5c30a279-b67f-46c3-980e-e38b9cc27eb9\") " pod="openstack/neutron-3f6b-account-create-update-sbkkw" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.921665 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15003660-6e4d-427f-8386-f4be36fc25f8-operator-scripts\") pod \"neutron-db-create-xdgf6\" (UID: \"15003660-6e4d-427f-8386-f4be36fc25f8\") " pod="openstack/neutron-db-create-xdgf6" Dec 03 22:26:12 crc kubenswrapper[4830]: I1203 22:26:12.951611 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9pgz\" (UniqueName: \"kubernetes.io/projected/15003660-6e4d-427f-8386-f4be36fc25f8-kube-api-access-b9pgz\") pod \"neutron-db-create-xdgf6\" (UID: \"15003660-6e4d-427f-8386-f4be36fc25f8\") " pod="openstack/neutron-db-create-xdgf6" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.007158 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6j6q9" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.015487 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-aa18-account-create-update-l7lnv"] Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.023933 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.024433 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c30a279-b67f-46c3-980e-e38b9cc27eb9-operator-scripts\") pod \"neutron-3f6b-account-create-update-sbkkw\" (UID: \"5c30a279-b67f-46c3-980e-e38b9cc27eb9\") " pod="openstack/neutron-3f6b-account-create-update-sbkkw" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.024579 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwfxb\" (UniqueName: \"kubernetes.io/projected/5c30a279-b67f-46c3-980e-e38b9cc27eb9-kube-api-access-nwfxb\") pod \"neutron-3f6b-account-create-update-sbkkw\" (UID: \"5c30a279-b67f-46c3-980e-e38b9cc27eb9\") " pod="openstack/neutron-3f6b-account-create-update-sbkkw" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.025264 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c30a279-b67f-46c3-980e-e38b9cc27eb9-operator-scripts\") pod \"neutron-3f6b-account-create-update-sbkkw\" (UID: \"5c30a279-b67f-46c3-980e-e38b9cc27eb9\") " pod="openstack/neutron-3f6b-account-create-update-sbkkw" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.045203 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-aa18-account-create-update-l7lnv"] Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.050458 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.054499 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwfxb\" (UniqueName: \"kubernetes.io/projected/5c30a279-b67f-46c3-980e-e38b9cc27eb9-kube-api-access-nwfxb\") pod \"neutron-3f6b-account-create-update-sbkkw\" (UID: 
\"5c30a279-b67f-46c3-980e-e38b9cc27eb9\") " pod="openstack/neutron-3f6b-account-create-update-sbkkw" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.126693 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45q48\" (UniqueName: \"kubernetes.io/projected/aceb571f-95c2-46b6-a84a-fdd448fdc167-kube-api-access-45q48\") pod \"cloudkitty-aa18-account-create-update-l7lnv\" (UID: \"aceb571f-95c2-46b6-a84a-fdd448fdc167\") " pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.126736 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aceb571f-95c2-46b6-a84a-fdd448fdc167-operator-scripts\") pod \"cloudkitty-aa18-account-create-update-l7lnv\" (UID: \"aceb571f-95c2-46b6-a84a-fdd448fdc167\") " pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.163304 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xdgf6" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.196467 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3f6b-account-create-update-sbkkw" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.228147 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45q48\" (UniqueName: \"kubernetes.io/projected/aceb571f-95c2-46b6-a84a-fdd448fdc167-kube-api-access-45q48\") pod \"cloudkitty-aa18-account-create-update-l7lnv\" (UID: \"aceb571f-95c2-46b6-a84a-fdd448fdc167\") " pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.228187 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aceb571f-95c2-46b6-a84a-fdd448fdc167-operator-scripts\") pod \"cloudkitty-aa18-account-create-update-l7lnv\" (UID: \"aceb571f-95c2-46b6-a84a-fdd448fdc167\") " pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.228902 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aceb571f-95c2-46b6-a84a-fdd448fdc167-operator-scripts\") pod \"cloudkitty-aa18-account-create-update-l7lnv\" (UID: \"aceb571f-95c2-46b6-a84a-fdd448fdc167\") " pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.256171 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45q48\" (UniqueName: \"kubernetes.io/projected/aceb571f-95c2-46b6-a84a-fdd448fdc167-kube-api-access-45q48\") pod \"cloudkitty-aa18-account-create-update-l7lnv\" (UID: \"aceb571f-95c2-46b6-a84a-fdd448fdc167\") " pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.361249 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"097c39dcd5b24c29121ba25c9bf46e62a1df65f995347b428338cb094b0fa167"} Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.361289 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"d787dd8d98d0206b71a9ae0a3e0632e27a7f28ca755fbe5729b850d84e2e3029"} Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.385737 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-w9fdb"] Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.392416 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.506779 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pkmrj"] Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.727276 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-qrsbk"] Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.757554 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-58de-account-create-update-nxbc7"] Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.807825 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3133-account-create-update-2jnjh"] Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.884187 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6j6q9"] Dec 03 22:26:13 crc kubenswrapper[4830]: I1203 22:26:13.977453 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xdgf6"] Dec 03 22:26:14 crc kubenswrapper[4830]: W1203 22:26:14.000173 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15003660_6e4d_427f_8386_f4be36fc25f8.slice/crio-c63e41502a2bf7f78fdfaf2e99c64c604d9911215e1760e9d86b29cd87168960 WatchSource:0}: Error finding container c63e41502a2bf7f78fdfaf2e99c64c604d9911215e1760e9d86b29cd87168960: Status 404 returned error can't find the container with id c63e41502a2bf7f78fdfaf2e99c64c604d9911215e1760e9d86b29cd87168960 Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.131541 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3f6b-account-create-update-sbkkw"] Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.269046 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-aa18-account-create-update-l7lnv"] Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.369223 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xdgf6" event={"ID":"15003660-6e4d-427f-8386-f4be36fc25f8","Type":"ContainerStarted","Data":"c63e41502a2bf7f78fdfaf2e99c64c604d9911215e1760e9d86b29cd87168960"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.370488 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3f6b-account-create-update-sbkkw" event={"ID":"5c30a279-b67f-46c3-980e-e38b9cc27eb9","Type":"ContainerStarted","Data":"a67d4a13642504a6a913a61fdccdf71cbeb6c8ec64bfd465a6644d8d385e8e6b"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.372012 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-58de-account-create-update-nxbc7" event={"ID":"ece3f713-2d71-4b1f-a20c-504f9e2dec24","Type":"ContainerStarted","Data":"1b348f93f5e718fd1311fcce5b4054d2a45bafdbfb9837350e4b193e50f7aeb7"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.372081 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-58de-account-create-update-nxbc7" 
event={"ID":"ece3f713-2d71-4b1f-a20c-504f9e2dec24","Type":"ContainerStarted","Data":"51c427da3f2c060e1ea24e1fc728e1c9e40a7447a3696e5d95adbacdbd9a47dd"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.373430 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-qrsbk" event={"ID":"2fd0383c-30e5-4787-9d20-fe1c5de32b62","Type":"ContainerStarted","Data":"0ed64c44fe8b23d64eec54eb01a7a7bced72448258e02568a28dd674e97f77e0"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.373471 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-qrsbk" event={"ID":"2fd0383c-30e5-4787-9d20-fe1c5de32b62","Type":"ContainerStarted","Data":"690f50ffbf9b781a99846ed77cb5d39504d78ac46df77abdc2c5761dd6ffc6ed"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.375243 4830 generic.go:334] "Generic (PLEG): container finished" podID="5ba568a6-43aa-4368-a33f-50b182c1faf8" containerID="ea1df43caaca0ac62e8a3f3f6c90cf718bce8a0d68992aabab8cd0084c1eadf5" exitCode=0 Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.375281 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w9fdb" event={"ID":"5ba568a6-43aa-4368-a33f-50b182c1faf8","Type":"ContainerDied","Data":"ea1df43caaca0ac62e8a3f3f6c90cf718bce8a0d68992aabab8cd0084c1eadf5"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.375501 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w9fdb" event={"ID":"5ba568a6-43aa-4368-a33f-50b182c1faf8","Type":"ContainerStarted","Data":"8401aff6232138dc2f203c465dd066ad0cfd0b6abd4f01e0027a5b2423b4b56c"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.376705 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pkmrj" event={"ID":"600b2e52-def7-4084-982d-5ccef14d35fd","Type":"ContainerStarted","Data":"2fb09a216cdc3994f100acf5555012c5fe650752d985e5588d408afd4fdcc853"} Dec 03 22:26:14 crc 
kubenswrapper[4830]: I1203 22:26:14.376747 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pkmrj" event={"ID":"600b2e52-def7-4084-982d-5ccef14d35fd","Type":"ContainerStarted","Data":"39bb2550678b0a9945fbdb6f051d020aa10b20b59fc85e922e8473bf5fa190c2"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.378799 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3133-account-create-update-2jnjh" event={"ID":"698c1d5e-41e8-4701-b3d3-81b015482ff1","Type":"ContainerStarted","Data":"daa7582b9598a4c07093742df2d8fc40cf921718dedd4cc8a0238a8e0cf03fdc"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.379914 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" event={"ID":"aceb571f-95c2-46b6-a84a-fdd448fdc167","Type":"ContainerStarted","Data":"d86a71342d4acbd4d7800637011f0381b25e291c09a8e5c29f0ea02461317b6c"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.381131 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6j6q9" event={"ID":"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4","Type":"ContainerStarted","Data":"9a102987ecb1b4f4610dc425d36dc302b036f8565c8c1878c34fa32c877d3417"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.387160 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"52f18846b01e24199129740307034c3399b211518441bfbb288b01a30b337228"} Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.397485 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-58de-account-create-update-nxbc7" podStartSLOduration=2.397464175 podStartE2EDuration="2.397464175s" podCreationTimestamp="2025-12-03 22:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 22:26:14.390094775 +0000 UTC m=+1263.386556124" watchObservedRunningTime="2025-12-03 22:26:14.397464175 +0000 UTC m=+1263.393925524" Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.411161 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-pkmrj" podStartSLOduration=2.411138627 podStartE2EDuration="2.411138627s" podCreationTimestamp="2025-12-03 22:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:14.40427366 +0000 UTC m=+1263.400735009" watchObservedRunningTime="2025-12-03 22:26:14.411138627 +0000 UTC m=+1263.407599996" Dec 03 22:26:14 crc kubenswrapper[4830]: I1203 22:26:14.432727 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-create-qrsbk" podStartSLOduration=2.432701323 podStartE2EDuration="2.432701323s" podCreationTimestamp="2025-12-03 22:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:14.432544399 +0000 UTC m=+1263.429005748" watchObservedRunningTime="2025-12-03 22:26:14.432701323 +0000 UTC m=+1263.429162712" Dec 03 22:26:15 crc kubenswrapper[4830]: I1203 22:26:15.982234 4830 generic.go:334] "Generic (PLEG): container finished" podID="698c1d5e-41e8-4701-b3d3-81b015482ff1" containerID="fdfe759c0a5830ae7a8b9cb69bde7685d6ffd3f470eac868625a72890e05553e" exitCode=0 Dec 03 22:26:15 crc kubenswrapper[4830]: I1203 22:26:15.989691 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Dec 03 22:26:15 crc kubenswrapper[4830]: I1203 22:26:15.989727 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3133-account-create-update-2jnjh" 
event={"ID":"698c1d5e-41e8-4701-b3d3-81b015482ff1","Type":"ContainerDied","Data":"fdfe759c0a5830ae7a8b9cb69bde7685d6ffd3f470eac868625a72890e05553e"} Dec 03 22:26:15 crc kubenswrapper[4830]: I1203 22:26:15.998788 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e","Type":"ContainerStarted","Data":"dc5151a8c2b78c9e0cc9f96508df98d46f48eba3f1c29df76f627886cb101b82"} Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.014052 4830 generic.go:334] "Generic (PLEG): container finished" podID="86b4b71c-3ed3-4413-a90b-4523b4a5c549" containerID="11c152810e3c9c9f455e71a9a36ff1ed2fb6cd873d9093b71253631afff9ef4c" exitCode=0 Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.014082 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"86b4b71c-3ed3-4413-a90b-4523b4a5c549","Type":"ContainerDied","Data":"11c152810e3c9c9f455e71a9a36ff1ed2fb6cd873d9093b71253631afff9ef4c"} Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.015805 4830 generic.go:334] "Generic (PLEG): container finished" podID="15003660-6e4d-427f-8386-f4be36fc25f8" containerID="11b0655835c105a17984c97cf10b835f59090637aeb762ab8ee59130236101c1" exitCode=0 Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.015841 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xdgf6" event={"ID":"15003660-6e4d-427f-8386-f4be36fc25f8","Type":"ContainerDied","Data":"11b0655835c105a17984c97cf10b835f59090637aeb762ab8ee59130236101c1"} Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.017993 4830 generic.go:334] "Generic (PLEG): container finished" podID="ece3f713-2d71-4b1f-a20c-504f9e2dec24" containerID="1b348f93f5e718fd1311fcce5b4054d2a45bafdbfb9837350e4b193e50f7aeb7" exitCode=0 Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.018032 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-58de-account-create-update-nxbc7" event={"ID":"ece3f713-2d71-4b1f-a20c-504f9e2dec24","Type":"ContainerDied","Data":"1b348f93f5e718fd1311fcce5b4054d2a45bafdbfb9837350e4b193e50f7aeb7"} Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.019870 4830 generic.go:334] "Generic (PLEG): container finished" podID="2fd0383c-30e5-4787-9d20-fe1c5de32b62" containerID="0ed64c44fe8b23d64eec54eb01a7a7bced72448258e02568a28dd674e97f77e0" exitCode=0 Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.019913 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-qrsbk" event={"ID":"2fd0383c-30e5-4787-9d20-fe1c5de32b62","Type":"ContainerDied","Data":"0ed64c44fe8b23d64eec54eb01a7a7bced72448258e02568a28dd674e97f77e0"} Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.021283 4830 generic.go:334] "Generic (PLEG): container finished" podID="aceb571f-95c2-46b6-a84a-fdd448fdc167" containerID="ef9b41d81a3416d5b15a18e15ae8a2717234d9af53878c2952084f7c9a016e31" exitCode=0 Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.021321 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" event={"ID":"aceb571f-95c2-46b6-a84a-fdd448fdc167","Type":"ContainerDied","Data":"ef9b41d81a3416d5b15a18e15ae8a2717234d9af53878c2952084f7c9a016e31"} Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.040797 4830 generic.go:334] "Generic (PLEG): container finished" podID="600b2e52-def7-4084-982d-5ccef14d35fd" containerID="2fb09a216cdc3994f100acf5555012c5fe650752d985e5588d408afd4fdcc853" exitCode=0 Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.040872 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pkmrj" event={"ID":"600b2e52-def7-4084-982d-5ccef14d35fd","Type":"ContainerDied","Data":"2fb09a216cdc3994f100acf5555012c5fe650752d985e5588d408afd4fdcc853"} Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.053193 4830 
generic.go:334] "Generic (PLEG): container finished" podID="5c30a279-b67f-46c3-980e-e38b9cc27eb9" containerID="3a84c594a8dcde177c39dc868c2b7ee8b8294d617e388f110d43ee4af4f7e69f" exitCode=0 Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.053417 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3f6b-account-create-update-sbkkw" event={"ID":"5c30a279-b67f-46c3-980e-e38b9cc27eb9","Type":"ContainerDied","Data":"3a84c594a8dcde177c39dc868c2b7ee8b8294d617e388f110d43ee4af4f7e69f"} Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.069623 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=44.375136124 podStartE2EDuration="51.069602115s" podCreationTimestamp="2025-12-03 22:25:25 +0000 UTC" firstStartedPulling="2025-12-03 22:26:04.788727617 +0000 UTC m=+1253.785188966" lastFinishedPulling="2025-12-03 22:26:11.483193598 +0000 UTC m=+1260.479654957" observedRunningTime="2025-12-03 22:26:16.068538886 +0000 UTC m=+1265.065000235" watchObservedRunningTime="2025-12-03 22:26:16.069602115 +0000 UTC m=+1265.066063464" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.464998 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-8kgxk"] Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.467147 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.480177 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.504571 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-8kgxk"] Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.571323 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.571391 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.571416 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-config\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.571437 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " 
pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.571498 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-svc\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.571726 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj4fh\" (UniqueName: \"kubernetes.io/projected/9cbc9888-4a88-4d22-a914-99d23eefb4cd-kube-api-access-xj4fh\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.614998 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-w9fdb" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.673381 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-svc\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.673457 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj4fh\" (UniqueName: \"kubernetes.io/projected/9cbc9888-4a88-4d22-a914-99d23eefb4cd-kube-api-access-xj4fh\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.673536 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.673581 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.673599 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-config\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.673613 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.674297 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-svc\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.674361 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-nb\") 
pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.674619 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.674855 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-config\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.674933 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.688749 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj4fh\" (UniqueName: \"kubernetes.io/projected/9cbc9888-4a88-4d22-a914-99d23eefb4cd-kube-api-access-xj4fh\") pod \"dnsmasq-dns-764c5664d7-8kgxk\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") " pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.774830 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ba568a6-43aa-4368-a33f-50b182c1faf8-operator-scripts\") pod \"5ba568a6-43aa-4368-a33f-50b182c1faf8\" (UID: 
\"5ba568a6-43aa-4368-a33f-50b182c1faf8\") " Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.774970 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7bjj\" (UniqueName: \"kubernetes.io/projected/5ba568a6-43aa-4368-a33f-50b182c1faf8-kube-api-access-t7bjj\") pod \"5ba568a6-43aa-4368-a33f-50b182c1faf8\" (UID: \"5ba568a6-43aa-4368-a33f-50b182c1faf8\") " Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.776151 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba568a6-43aa-4368-a33f-50b182c1faf8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ba568a6-43aa-4368-a33f-50b182c1faf8" (UID: "5ba568a6-43aa-4368-a33f-50b182c1faf8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.778350 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba568a6-43aa-4368-a33f-50b182c1faf8-kube-api-access-t7bjj" (OuterVolumeSpecName: "kube-api-access-t7bjj") pod "5ba568a6-43aa-4368-a33f-50b182c1faf8" (UID: "5ba568a6-43aa-4368-a33f-50b182c1faf8"). InnerVolumeSpecName "kube-api-access-t7bjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.832849 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.877489 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7bjj\" (UniqueName: \"kubernetes.io/projected/5ba568a6-43aa-4368-a33f-50b182c1faf8-kube-api-access-t7bjj\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:16 crc kubenswrapper[4830]: I1203 22:26:16.877546 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ba568a6-43aa-4368-a33f-50b182c1faf8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:17 crc kubenswrapper[4830]: I1203 22:26:17.066149 4830 generic.go:334] "Generic (PLEG): container finished" podID="b445f62c-c9c8-488f-ad3c-4fd162cb1092" containerID="dfa4d1d9a173f4260396b1f874bc8f68e8891bf73bc1bc8071a97e792dab79ad" exitCode=0 Dec 03 22:26:17 crc kubenswrapper[4830]: I1203 22:26:17.066292 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-75hs2" event={"ID":"b445f62c-c9c8-488f-ad3c-4fd162cb1092","Type":"ContainerDied","Data":"dfa4d1d9a173f4260396b1f874bc8f68e8891bf73bc1bc8071a97e792dab79ad"} Dec 03 22:26:17 crc kubenswrapper[4830]: I1203 22:26:17.071124 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w9fdb" event={"ID":"5ba568a6-43aa-4368-a33f-50b182c1faf8","Type":"ContainerDied","Data":"8401aff6232138dc2f203c465dd066ad0cfd0b6abd4f01e0027a5b2423b4b56c"} Dec 03 22:26:17 crc kubenswrapper[4830]: I1203 22:26:17.071156 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8401aff6232138dc2f203c465dd066ad0cfd0b6abd4f01e0027a5b2423b4b56c" Dec 03 22:26:17 crc kubenswrapper[4830]: I1203 22:26:17.071202 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-w9fdb" Dec 03 22:26:17 crc kubenswrapper[4830]: I1203 22:26:17.076044 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"86b4b71c-3ed3-4413-a90b-4523b4a5c549","Type":"ContainerStarted","Data":"9b466721cbca689036ef75e3ee562c89659971264b674c7dfad053544f121e1d"} Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.000055 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-58de-account-create-update-nxbc7" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.030029 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xdgf6" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.047359 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pkmrj" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.052486 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3f6b-account-create-update-sbkkw" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.069855 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3133-account-create-update-2jnjh" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.077905 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-75hs2" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.082980 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-qrsbk" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.123298 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.131039 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-75hs2" event={"ID":"b445f62c-c9c8-488f-ad3c-4fd162cb1092","Type":"ContainerDied","Data":"8a2590b8e9824347c7a007f48cc03a7c09dcc896105ddaeff6d7fbfb4159b6cc"} Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.131076 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a2590b8e9824347c7a007f48cc03a7c09dcc896105ddaeff6d7fbfb4159b6cc" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.131196 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-75hs2" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.138218 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-58de-account-create-update-nxbc7" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.138481 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-58de-account-create-update-nxbc7" event={"ID":"ece3f713-2d71-4b1f-a20c-504f9e2dec24","Type":"ContainerDied","Data":"51c427da3f2c060e1ea24e1fc728e1c9e40a7447a3696e5d95adbacdbd9a47dd"} Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.138544 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51c427da3f2c060e1ea24e1fc728e1c9e40a7447a3696e5d95adbacdbd9a47dd" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.141271 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-qrsbk" event={"ID":"2fd0383c-30e5-4787-9d20-fe1c5de32b62","Type":"ContainerDied","Data":"690f50ffbf9b781a99846ed77cb5d39504d78ac46df77abdc2c5761dd6ffc6ed"} Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.141295 4830 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="690f50ffbf9b781a99846ed77cb5d39504d78ac46df77abdc2c5761dd6ffc6ed" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.141303 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-qrsbk" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.149869 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3133-account-create-update-2jnjh" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.149948 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3133-account-create-update-2jnjh" event={"ID":"698c1d5e-41e8-4701-b3d3-81b015482ff1","Type":"ContainerDied","Data":"daa7582b9598a4c07093742df2d8fc40cf921718dedd4cc8a0238a8e0cf03fdc"} Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.149981 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daa7582b9598a4c07093742df2d8fc40cf921718dedd4cc8a0238a8e0cf03fdc" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154050 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fd0383c-30e5-4787-9d20-fe1c5de32b62-operator-scripts\") pod \"2fd0383c-30e5-4787-9d20-fe1c5de32b62\" (UID: \"2fd0383c-30e5-4787-9d20-fe1c5de32b62\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154155 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9pgz\" (UniqueName: \"kubernetes.io/projected/15003660-6e4d-427f-8386-f4be36fc25f8-kube-api-access-b9pgz\") pod \"15003660-6e4d-427f-8386-f4be36fc25f8\" (UID: \"15003660-6e4d-427f-8386-f4be36fc25f8\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154190 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-db-sync-config-data\") pod \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154229 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c30a279-b67f-46c3-980e-e38b9cc27eb9-operator-scripts\") pod \"5c30a279-b67f-46c3-980e-e38b9cc27eb9\" (UID: \"5c30a279-b67f-46c3-980e-e38b9cc27eb9\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154254 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvct2\" (UniqueName: \"kubernetes.io/projected/b445f62c-c9c8-488f-ad3c-4fd162cb1092-kube-api-access-zvct2\") pod \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154273 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92jmt\" (UniqueName: \"kubernetes.io/projected/698c1d5e-41e8-4701-b3d3-81b015482ff1-kube-api-access-92jmt\") pod \"698c1d5e-41e8-4701-b3d3-81b015482ff1\" (UID: \"698c1d5e-41e8-4701-b3d3-81b015482ff1\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154301 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece3f713-2d71-4b1f-a20c-504f9e2dec24-operator-scripts\") pod \"ece3f713-2d71-4b1f-a20c-504f9e2dec24\" (UID: \"ece3f713-2d71-4b1f-a20c-504f9e2dec24\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154383 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-config-data\") pod \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " Dec 03 22:26:20 crc 
kubenswrapper[4830]: I1203 22:26:20.154401 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698c1d5e-41e8-4701-b3d3-81b015482ff1-operator-scripts\") pod \"698c1d5e-41e8-4701-b3d3-81b015482ff1\" (UID: \"698c1d5e-41e8-4701-b3d3-81b015482ff1\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154463 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxszk\" (UniqueName: \"kubernetes.io/projected/600b2e52-def7-4084-982d-5ccef14d35fd-kube-api-access-jxszk\") pod \"600b2e52-def7-4084-982d-5ccef14d35fd\" (UID: \"600b2e52-def7-4084-982d-5ccef14d35fd\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154491 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-combined-ca-bundle\") pod \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\" (UID: \"b445f62c-c9c8-488f-ad3c-4fd162cb1092\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154558 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwfxb\" (UniqueName: \"kubernetes.io/projected/5c30a279-b67f-46c3-980e-e38b9cc27eb9-kube-api-access-nwfxb\") pod \"5c30a279-b67f-46c3-980e-e38b9cc27eb9\" (UID: \"5c30a279-b67f-46c3-980e-e38b9cc27eb9\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154591 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4m5m\" (UniqueName: \"kubernetes.io/projected/ece3f713-2d71-4b1f-a20c-504f9e2dec24-kube-api-access-c4m5m\") pod \"ece3f713-2d71-4b1f-a20c-504f9e2dec24\" (UID: \"ece3f713-2d71-4b1f-a20c-504f9e2dec24\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154616 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/600b2e52-def7-4084-982d-5ccef14d35fd-operator-scripts\") pod \"600b2e52-def7-4084-982d-5ccef14d35fd\" (UID: \"600b2e52-def7-4084-982d-5ccef14d35fd\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154639 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltspv\" (UniqueName: \"kubernetes.io/projected/2fd0383c-30e5-4787-9d20-fe1c5de32b62-kube-api-access-ltspv\") pod \"2fd0383c-30e5-4787-9d20-fe1c5de32b62\" (UID: \"2fd0383c-30e5-4787-9d20-fe1c5de32b62\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.154658 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15003660-6e4d-427f-8386-f4be36fc25f8-operator-scripts\") pod \"15003660-6e4d-427f-8386-f4be36fc25f8\" (UID: \"15003660-6e4d-427f-8386-f4be36fc25f8\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.157191 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c30a279-b67f-46c3-980e-e38b9cc27eb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c30a279-b67f-46c3-980e-e38b9cc27eb9" (UID: "5c30a279-b67f-46c3-980e-e38b9cc27eb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.158779 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ece3f713-2d71-4b1f-a20c-504f9e2dec24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ece3f713-2d71-4b1f-a20c-504f9e2dec24" (UID: "ece3f713-2d71-4b1f-a20c-504f9e2dec24"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.159460 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd0383c-30e5-4787-9d20-fe1c5de32b62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2fd0383c-30e5-4787-9d20-fe1c5de32b62" (UID: "2fd0383c-30e5-4787-9d20-fe1c5de32b62"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.162987 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece3f713-2d71-4b1f-a20c-504f9e2dec24-kube-api-access-c4m5m" (OuterVolumeSpecName: "kube-api-access-c4m5m") pod "ece3f713-2d71-4b1f-a20c-504f9e2dec24" (UID: "ece3f713-2d71-4b1f-a20c-504f9e2dec24"). InnerVolumeSpecName "kube-api-access-c4m5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.163745 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/600b2e52-def7-4084-982d-5ccef14d35fd-kube-api-access-jxszk" (OuterVolumeSpecName: "kube-api-access-jxszk") pod "600b2e52-def7-4084-982d-5ccef14d35fd" (UID: "600b2e52-def7-4084-982d-5ccef14d35fd"). InnerVolumeSpecName "kube-api-access-jxszk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.165065 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/600b2e52-def7-4084-982d-5ccef14d35fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "600b2e52-def7-4084-982d-5ccef14d35fd" (UID: "600b2e52-def7-4084-982d-5ccef14d35fd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.165253 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698c1d5e-41e8-4701-b3d3-81b015482ff1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "698c1d5e-41e8-4701-b3d3-81b015482ff1" (UID: "698c1d5e-41e8-4701-b3d3-81b015482ff1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.166008 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15003660-6e4d-427f-8386-f4be36fc25f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15003660-6e4d-427f-8386-f4be36fc25f8" (UID: "15003660-6e4d-427f-8386-f4be36fc25f8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.166221 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fd0383c-30e5-4787-9d20-fe1c5de32b62-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.166308 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c30a279-b67f-46c3-980e-e38b9cc27eb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.166675 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece3f713-2d71-4b1f-a20c-504f9e2dec24-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.173431 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c30a279-b67f-46c3-980e-e38b9cc27eb9-kube-api-access-nwfxb" 
(OuterVolumeSpecName: "kube-api-access-nwfxb") pod "5c30a279-b67f-46c3-980e-e38b9cc27eb9" (UID: "5c30a279-b67f-46c3-980e-e38b9cc27eb9"). InnerVolumeSpecName "kube-api-access-nwfxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.174134 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b445f62c-c9c8-488f-ad3c-4fd162cb1092-kube-api-access-zvct2" (OuterVolumeSpecName: "kube-api-access-zvct2") pod "b445f62c-c9c8-488f-ad3c-4fd162cb1092" (UID: "b445f62c-c9c8-488f-ad3c-4fd162cb1092"). InnerVolumeSpecName "kube-api-access-zvct2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.175080 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b445f62c-c9c8-488f-ad3c-4fd162cb1092" (UID: "b445f62c-c9c8-488f-ad3c-4fd162cb1092"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.174498 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15003660-6e4d-427f-8386-f4be36fc25f8-kube-api-access-b9pgz" (OuterVolumeSpecName: "kube-api-access-b9pgz") pod "15003660-6e4d-427f-8386-f4be36fc25f8" (UID: "15003660-6e4d-427f-8386-f4be36fc25f8"). InnerVolumeSpecName "kube-api-access-b9pgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.176981 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"86b4b71c-3ed3-4413-a90b-4523b4a5c549","Type":"ContainerStarted","Data":"00a2755bca85017879668e3b3901068995d561fdb92a80bd6f20b5ce6df9a9a1"} Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.178409 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"86b4b71c-3ed3-4413-a90b-4523b4a5c549","Type":"ContainerStarted","Data":"02902fecd36be7ad924940c3445100d6608e69f46f2aed176f1d011b903bf423"} Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.189071 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pkmrj" event={"ID":"600b2e52-def7-4084-982d-5ccef14d35fd","Type":"ContainerDied","Data":"39bb2550678b0a9945fbdb6f051d020aa10b20b59fc85e922e8473bf5fa190c2"} Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.189127 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39bb2550678b0a9945fbdb6f051d020aa10b20b59fc85e922e8473bf5fa190c2" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.189266 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pkmrj" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.192177 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xdgf6" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.192197 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xdgf6" event={"ID":"15003660-6e4d-427f-8386-f4be36fc25f8","Type":"ContainerDied","Data":"c63e41502a2bf7f78fdfaf2e99c64c604d9911215e1760e9d86b29cd87168960"} Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.192247 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c63e41502a2bf7f78fdfaf2e99c64c604d9911215e1760e9d86b29cd87168960" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.194549 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3f6b-account-create-update-sbkkw" event={"ID":"5c30a279-b67f-46c3-980e-e38b9cc27eb9","Type":"ContainerDied","Data":"a67d4a13642504a6a913a61fdccdf71cbeb6c8ec64bfd465a6644d8d385e8e6b"} Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.194587 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a67d4a13642504a6a913a61fdccdf71cbeb6c8ec64bfd465a6644d8d385e8e6b" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.194657 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3f6b-account-create-update-sbkkw" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.202761 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698c1d5e-41e8-4701-b3d3-81b015482ff1-kube-api-access-92jmt" (OuterVolumeSpecName: "kube-api-access-92jmt") pod "698c1d5e-41e8-4701-b3d3-81b015482ff1" (UID: "698c1d5e-41e8-4701-b3d3-81b015482ff1"). InnerVolumeSpecName "kube-api-access-92jmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.207472 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" event={"ID":"aceb571f-95c2-46b6-a84a-fdd448fdc167","Type":"ContainerDied","Data":"d86a71342d4acbd4d7800637011f0381b25e291c09a8e5c29f0ea02461317b6c"} Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.207537 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d86a71342d4acbd4d7800637011f0381b25e291c09a8e5c29f0ea02461317b6c" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.207602 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-aa18-account-create-update-l7lnv" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.210647 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b445f62c-c9c8-488f-ad3c-4fd162cb1092" (UID: "b445f62c-c9c8-488f-ad3c-4fd162cb1092"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.222316 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd0383c-30e5-4787-9d20-fe1c5de32b62-kube-api-access-ltspv" (OuterVolumeSpecName: "kube-api-access-ltspv") pod "2fd0383c-30e5-4787-9d20-fe1c5de32b62" (UID: "2fd0383c-30e5-4787-9d20-fe1c5de32b62"). InnerVolumeSpecName "kube-api-access-ltspv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.244229 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.244204134 podStartE2EDuration="15.244204134s" podCreationTimestamp="2025-12-03 22:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:20.21794971 +0000 UTC m=+1269.214411059" watchObservedRunningTime="2025-12-03 22:26:20.244204134 +0000 UTC m=+1269.240665483" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.256364 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-6j6q9" podStartSLOduration=2.373072833 podStartE2EDuration="8.256346245s" podCreationTimestamp="2025-12-03 22:26:12 +0000 UTC" firstStartedPulling="2025-12-03 22:26:13.944677033 +0000 UTC m=+1262.941138382" lastFinishedPulling="2025-12-03 22:26:19.827950405 +0000 UTC m=+1268.824411794" observedRunningTime="2025-12-03 22:26:20.235559449 +0000 UTC m=+1269.232020818" watchObservedRunningTime="2025-12-03 22:26:20.256346245 +0000 UTC m=+1269.252807594" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.268131 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45q48\" (UniqueName: \"kubernetes.io/projected/aceb571f-95c2-46b6-a84a-fdd448fdc167-kube-api-access-45q48\") pod \"aceb571f-95c2-46b6-a84a-fdd448fdc167\" (UID: \"aceb571f-95c2-46b6-a84a-fdd448fdc167\") " Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.268266 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aceb571f-95c2-46b6-a84a-fdd448fdc167-operator-scripts\") pod \"aceb571f-95c2-46b6-a84a-fdd448fdc167\" (UID: \"aceb571f-95c2-46b6-a84a-fdd448fdc167\") " Dec 03 22:26:20 crc 
kubenswrapper[4830]: I1203 22:26:20.268791 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aceb571f-95c2-46b6-a84a-fdd448fdc167-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aceb571f-95c2-46b6-a84a-fdd448fdc167" (UID: "aceb571f-95c2-46b6-a84a-fdd448fdc167"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269001 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9pgz\" (UniqueName: \"kubernetes.io/projected/15003660-6e4d-427f-8386-f4be36fc25f8-kube-api-access-b9pgz\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269019 4830 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269029 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aceb571f-95c2-46b6-a84a-fdd448fdc167-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269039 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvct2\" (UniqueName: \"kubernetes.io/projected/b445f62c-c9c8-488f-ad3c-4fd162cb1092-kube-api-access-zvct2\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269048 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92jmt\" (UniqueName: \"kubernetes.io/projected/698c1d5e-41e8-4701-b3d3-81b015482ff1-kube-api-access-92jmt\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269056 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/698c1d5e-41e8-4701-b3d3-81b015482ff1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269065 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxszk\" (UniqueName: \"kubernetes.io/projected/600b2e52-def7-4084-982d-5ccef14d35fd-kube-api-access-jxszk\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269073 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269083 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwfxb\" (UniqueName: \"kubernetes.io/projected/5c30a279-b67f-46c3-980e-e38b9cc27eb9-kube-api-access-nwfxb\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269091 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4m5m\" (UniqueName: \"kubernetes.io/projected/ece3f713-2d71-4b1f-a20c-504f9e2dec24-kube-api-access-c4m5m\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269154 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/600b2e52-def7-4084-982d-5ccef14d35fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269163 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltspv\" (UniqueName: \"kubernetes.io/projected/2fd0383c-30e5-4787-9d20-fe1c5de32b62-kube-api-access-ltspv\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.269171 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/15003660-6e4d-427f-8386-f4be36fc25f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.278751 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-config-data" (OuterVolumeSpecName: "config-data") pod "b445f62c-c9c8-488f-ad3c-4fd162cb1092" (UID: "b445f62c-c9c8-488f-ad3c-4fd162cb1092"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.288752 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aceb571f-95c2-46b6-a84a-fdd448fdc167-kube-api-access-45q48" (OuterVolumeSpecName: "kube-api-access-45q48") pod "aceb571f-95c2-46b6-a84a-fdd448fdc167" (UID: "aceb571f-95c2-46b6-a84a-fdd448fdc167"). InnerVolumeSpecName "kube-api-access-45q48". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.330670 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-8kgxk"] Dec 03 22:26:20 crc kubenswrapper[4830]: W1203 22:26:20.335260 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cbc9888_4a88_4d22_a914_99d23eefb4cd.slice/crio-28676c38ac081395fa26907f19d3a059bfcd8575ed387b656348f809e09cf33e WatchSource:0}: Error finding container 28676c38ac081395fa26907f19d3a059bfcd8575ed387b656348f809e09cf33e: Status 404 returned error can't find the container with id 28676c38ac081395fa26907f19d3a059bfcd8575ed387b656348f809e09cf33e Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.370319 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45q48\" (UniqueName: \"kubernetes.io/projected/aceb571f-95c2-46b6-a84a-fdd448fdc167-kube-api-access-45q48\") on node \"crc\" DevicePath \"\"" Dec 03 
22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.370342 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b445f62c-c9c8-488f-ad3c-4fd162cb1092-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.679966 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.680009 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:20 crc kubenswrapper[4830]: I1203 22:26:20.693185 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.218616 4830 generic.go:334] "Generic (PLEG): container finished" podID="9cbc9888-4a88-4d22-a914-99d23eefb4cd" containerID="e1d34fa9788a63c17c33812209cf3bc33c240f156e6e0ce666d77ce1f5c2875d" exitCode=0 Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.218719 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" event={"ID":"9cbc9888-4a88-4d22-a914-99d23eefb4cd","Type":"ContainerDied","Data":"e1d34fa9788a63c17c33812209cf3bc33c240f156e6e0ce666d77ce1f5c2875d"} Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.220412 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" event={"ID":"9cbc9888-4a88-4d22-a914-99d23eefb4cd","Type":"ContainerStarted","Data":"28676c38ac081395fa26907f19d3a059bfcd8575ed387b656348f809e09cf33e"} Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.224552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6j6q9" event={"ID":"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4","Type":"ContainerStarted","Data":"f43cb1a53606b67fa1a754ad6d6b456f1bf7339c9f781128e798c0ac6f3f4098"} Dec 03 22:26:21 
crc kubenswrapper[4830]: I1203 22:26:21.230222 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.625421 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-8kgxk"] Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.699315 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tl8d8"] Dec 03 22:26:21 crc kubenswrapper[4830]: E1203 22:26:21.699702 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698c1d5e-41e8-4701-b3d3-81b015482ff1" containerName="mariadb-account-create-update" Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.699719 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="698c1d5e-41e8-4701-b3d3-81b015482ff1" containerName="mariadb-account-create-update" Dec 03 22:26:21 crc kubenswrapper[4830]: E1203 22:26:21.699740 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd0383c-30e5-4787-9d20-fe1c5de32b62" containerName="mariadb-database-create" Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.699746 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd0383c-30e5-4787-9d20-fe1c5de32b62" containerName="mariadb-database-create" Dec 03 22:26:21 crc kubenswrapper[4830]: E1203 22:26:21.699757 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b445f62c-c9c8-488f-ad3c-4fd162cb1092" containerName="glance-db-sync" Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.699764 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b445f62c-c9c8-488f-ad3c-4fd162cb1092" containerName="glance-db-sync" Dec 03 22:26:21 crc kubenswrapper[4830]: E1203 22:26:21.699780 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15003660-6e4d-427f-8386-f4be36fc25f8" containerName="mariadb-database-create" Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.699786 
4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="15003660-6e4d-427f-8386-f4be36fc25f8" containerName="mariadb-database-create" Dec 03 22:26:21 crc kubenswrapper[4830]: E1203 22:26:21.699802 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c30a279-b67f-46c3-980e-e38b9cc27eb9" containerName="mariadb-account-create-update" Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.699807 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c30a279-b67f-46c3-980e-e38b9cc27eb9" containerName="mariadb-account-create-update" Dec 03 22:26:21 crc kubenswrapper[4830]: E1203 22:26:21.699818 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aceb571f-95c2-46b6-a84a-fdd448fdc167" containerName="mariadb-account-create-update" Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.699824 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="aceb571f-95c2-46b6-a84a-fdd448fdc167" containerName="mariadb-account-create-update" Dec 03 22:26:21 crc kubenswrapper[4830]: E1203 22:26:21.699839 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece3f713-2d71-4b1f-a20c-504f9e2dec24" containerName="mariadb-account-create-update" Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.699845 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece3f713-2d71-4b1f-a20c-504f9e2dec24" containerName="mariadb-account-create-update" Dec 03 22:26:21 crc kubenswrapper[4830]: E1203 22:26:21.699853 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="600b2e52-def7-4084-982d-5ccef14d35fd" containerName="mariadb-database-create" Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.699859 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="600b2e52-def7-4084-982d-5ccef14d35fd" containerName="mariadb-database-create" Dec 03 22:26:21 crc kubenswrapper[4830]: E1203 22:26:21.699867 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba568a6-43aa-4368-a33f-50b182c1faf8" 
containerName="mariadb-database-create"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.699873 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba568a6-43aa-4368-a33f-50b182c1faf8" containerName="mariadb-database-create"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.700038 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba568a6-43aa-4368-a33f-50b182c1faf8" containerName="mariadb-database-create"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.700062 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b445f62c-c9c8-488f-ad3c-4fd162cb1092" containerName="glance-db-sync"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.700073 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="aceb571f-95c2-46b6-a84a-fdd448fdc167" containerName="mariadb-account-create-update"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.700088 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd0383c-30e5-4787-9d20-fe1c5de32b62" containerName="mariadb-database-create"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.700096 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="600b2e52-def7-4084-982d-5ccef14d35fd" containerName="mariadb-database-create"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.700105 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="15003660-6e4d-427f-8386-f4be36fc25f8" containerName="mariadb-database-create"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.700115 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece3f713-2d71-4b1f-a20c-504f9e2dec24" containerName="mariadb-account-create-update"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.700130 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c30a279-b67f-46c3-980e-e38b9cc27eb9" containerName="mariadb-account-create-update"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.700139 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="698c1d5e-41e8-4701-b3d3-81b015482ff1" containerName="mariadb-account-create-update"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.701077 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.795036 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tl8d8"]
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.825562 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.825601 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.825623 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-config\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.826333 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.826384 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k45rn\" (UniqueName: \"kubernetes.io/projected/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-kube-api-access-k45rn\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.826426 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.930768 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.930820 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.930849 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-config\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.930943 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.930985 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k45rn\" (UniqueName: \"kubernetes.io/projected/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-kube-api-access-k45rn\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.931019 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.931963 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.936495 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-config\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.936825 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.937648 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.947840 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:21 crc kubenswrapper[4830]: I1203 22:26:21.971060 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k45rn\" (UniqueName: \"kubernetes.io/projected/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-kube-api-access-k45rn\") pod \"dnsmasq-dns-74f6bcbc87-tl8d8\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:22 crc kubenswrapper[4830]: I1203 22:26:22.018697 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:22 crc kubenswrapper[4830]: I1203 22:26:22.257754 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" event={"ID":"9cbc9888-4a88-4d22-a914-99d23eefb4cd","Type":"ContainerStarted","Data":"62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf"}
Dec 03 22:26:22 crc kubenswrapper[4830]: I1203 22:26:22.304007 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" podStartSLOduration=6.3039876360000004 podStartE2EDuration="6.303987636s" podCreationTimestamp="2025-12-03 22:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:22.282583083 +0000 UTC m=+1271.279044432" watchObservedRunningTime="2025-12-03 22:26:22.303987636 +0000 UTC m=+1271.300448985"
Dec 03 22:26:22 crc kubenswrapper[4830]: I1203 22:26:22.501516 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tl8d8"]
Dec 03 22:26:22 crc kubenswrapper[4830]: W1203 22:26:22.512771 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a49f854_d439_4cc8_8183_84f0b5c6a0b1.slice/crio-3ed26745d63fe32cc6e166362e8e5066f5644fbd0814d3d2c7677dd830ab5359 WatchSource:0}: Error finding container 3ed26745d63fe32cc6e166362e8e5066f5644fbd0814d3d2c7677dd830ab5359: Status 404 returned error can't find the container with id 3ed26745d63fe32cc6e166362e8e5066f5644fbd0814d3d2c7677dd830ab5359
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.269121 4830 generic.go:334] "Generic (PLEG): container finished" podID="1a49f854-d439-4cc8-8183-84f0b5c6a0b1" containerID="d5815704f76914b486b6a708a7fb1c30c7b64e4af21e6d3386120c518c0de336" exitCode=0
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.269203 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8" event={"ID":"1a49f854-d439-4cc8-8183-84f0b5c6a0b1","Type":"ContainerDied","Data":"d5815704f76914b486b6a708a7fb1c30c7b64e4af21e6d3386120c518c0de336"}
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.270404 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8" event={"ID":"1a49f854-d439-4cc8-8183-84f0b5c6a0b1","Type":"ContainerStarted","Data":"3ed26745d63fe32cc6e166362e8e5066f5644fbd0814d3d2c7677dd830ab5359"}
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.270568 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" podUID="9cbc9888-4a88-4d22-a914-99d23eefb4cd" containerName="dnsmasq-dns" containerID="cri-o://62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf" gracePeriod=10
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.270880 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk"
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.654095 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk"
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.764525 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj4fh\" (UniqueName: \"kubernetes.io/projected/9cbc9888-4a88-4d22-a914-99d23eefb4cd-kube-api-access-xj4fh\") pod \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") "
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.764569 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-swift-storage-0\") pod \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") "
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.764690 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-config\") pod \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") "
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.764755 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-nb\") pod \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") "
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.764810 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-sb\") pod \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") "
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.764877 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-svc\") pod \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\" (UID: \"9cbc9888-4a88-4d22-a914-99d23eefb4cd\") "
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.770141 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cbc9888-4a88-4d22-a914-99d23eefb4cd-kube-api-access-xj4fh" (OuterVolumeSpecName: "kube-api-access-xj4fh") pod "9cbc9888-4a88-4d22-a914-99d23eefb4cd" (UID: "9cbc9888-4a88-4d22-a914-99d23eefb4cd"). InnerVolumeSpecName "kube-api-access-xj4fh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.809582 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9cbc9888-4a88-4d22-a914-99d23eefb4cd" (UID: "9cbc9888-4a88-4d22-a914-99d23eefb4cd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.812842 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9cbc9888-4a88-4d22-a914-99d23eefb4cd" (UID: "9cbc9888-4a88-4d22-a914-99d23eefb4cd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.820984 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-config" (OuterVolumeSpecName: "config") pod "9cbc9888-4a88-4d22-a914-99d23eefb4cd" (UID: "9cbc9888-4a88-4d22-a914-99d23eefb4cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.821459 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9cbc9888-4a88-4d22-a914-99d23eefb4cd" (UID: "9cbc9888-4a88-4d22-a914-99d23eefb4cd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.828248 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9cbc9888-4a88-4d22-a914-99d23eefb4cd" (UID: "9cbc9888-4a88-4d22-a914-99d23eefb4cd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.866945 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj4fh\" (UniqueName: \"kubernetes.io/projected/9cbc9888-4a88-4d22-a914-99d23eefb4cd-kube-api-access-xj4fh\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.866973 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.866983 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-config\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.866993 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.867001 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:23 crc kubenswrapper[4830]: I1203 22:26:23.867011 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cbc9888-4a88-4d22-a914-99d23eefb4cd-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.282774 4830 generic.go:334] "Generic (PLEG): container finished" podID="9cbc9888-4a88-4d22-a914-99d23eefb4cd" containerID="62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf" exitCode=0
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.282854 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk"
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.282878 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" event={"ID":"9cbc9888-4a88-4d22-a914-99d23eefb4cd","Type":"ContainerDied","Data":"62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf"}
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.283571 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-8kgxk" event={"ID":"9cbc9888-4a88-4d22-a914-99d23eefb4cd","Type":"ContainerDied","Data":"28676c38ac081395fa26907f19d3a059bfcd8575ed387b656348f809e09cf33e"}
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.283622 4830 scope.go:117] "RemoveContainer" containerID="62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf"
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.286631 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8" event={"ID":"1a49f854-d439-4cc8-8183-84f0b5c6a0b1","Type":"ContainerStarted","Data":"f87921027bd0ee6986522a671cbbec41dce3432f40ce134e0c856c3942e1b174"}
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.286788 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8"
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.289685 4830 generic.go:334] "Generic (PLEG): container finished" podID="2a872d21-c6ee-47bc-a31e-a529bd4e2ff4" containerID="f43cb1a53606b67fa1a754ad6d6b456f1bf7339c9f781128e798c0ac6f3f4098" exitCode=0
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.289730 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6j6q9" event={"ID":"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4","Type":"ContainerDied","Data":"f43cb1a53606b67fa1a754ad6d6b456f1bf7339c9f781128e798c0ac6f3f4098"}
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.319718 4830 scope.go:117] "RemoveContainer" containerID="e1d34fa9788a63c17c33812209cf3bc33c240f156e6e0ce666d77ce1f5c2875d"
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.333428 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8" podStartSLOduration=3.333413361 podStartE2EDuration="3.333413361s" podCreationTimestamp="2025-12-03 22:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:24.325583728 +0000 UTC m=+1273.322045087" watchObservedRunningTime="2025-12-03 22:26:24.333413361 +0000 UTC m=+1273.329874710"
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.359076 4830 scope.go:117] "RemoveContainer" containerID="62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf"
Dec 03 22:26:24 crc kubenswrapper[4830]: E1203 22:26:24.360071 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf\": container with ID starting with 62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf not found: ID does not exist" containerID="62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf"
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.360178 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf"} err="failed to get container status \"62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf\": rpc error: code = NotFound desc = could not find container \"62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf\": container with ID starting with 62acfeb36a52458bee0986786756f23cf3377ff0ecba189ffe774b12d28db7bf not found: ID does not exist"
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.360225 4830 scope.go:117] "RemoveContainer" containerID="e1d34fa9788a63c17c33812209cf3bc33c240f156e6e0ce666d77ce1f5c2875d"
Dec 03 22:26:24 crc kubenswrapper[4830]: E1203 22:26:24.360679 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1d34fa9788a63c17c33812209cf3bc33c240f156e6e0ce666d77ce1f5c2875d\": container with ID starting with e1d34fa9788a63c17c33812209cf3bc33c240f156e6e0ce666d77ce1f5c2875d not found: ID does not exist" containerID="e1d34fa9788a63c17c33812209cf3bc33c240f156e6e0ce666d77ce1f5c2875d"
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.360719 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1d34fa9788a63c17c33812209cf3bc33c240f156e6e0ce666d77ce1f5c2875d"} err="failed to get container status \"e1d34fa9788a63c17c33812209cf3bc33c240f156e6e0ce666d77ce1f5c2875d\": rpc error: code = NotFound desc = could not find container \"e1d34fa9788a63c17c33812209cf3bc33c240f156e6e0ce666d77ce1f5c2875d\": container with ID starting with e1d34fa9788a63c17c33812209cf3bc33c240f156e6e0ce666d77ce1f5c2875d not found: ID does not exist"
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.393896 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-8kgxk"]
Dec 03 22:26:24 crc kubenswrapper[4830]: I1203 22:26:24.406605 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-8kgxk"]
Dec 03 22:26:25 crc kubenswrapper[4830]: I1203 22:26:25.359586 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cbc9888-4a88-4d22-a914-99d23eefb4cd" path="/var/lib/kubelet/pods/9cbc9888-4a88-4d22-a914-99d23eefb4cd/volumes"
Dec 03 22:26:25 crc kubenswrapper[4830]: I1203 22:26:25.739042 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6j6q9"
Dec 03 22:26:25 crc kubenswrapper[4830]: I1203 22:26:25.797061 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zds78\" (UniqueName: \"kubernetes.io/projected/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-kube-api-access-zds78\") pod \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") "
Dec 03 22:26:25 crc kubenswrapper[4830]: I1203 22:26:25.797154 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-combined-ca-bundle\") pod \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") "
Dec 03 22:26:25 crc kubenswrapper[4830]: I1203 22:26:25.797191 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-config-data\") pod \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\" (UID: \"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4\") "
Dec 03 22:26:25 crc kubenswrapper[4830]: I1203 22:26:25.811063 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-kube-api-access-zds78" (OuterVolumeSpecName: "kube-api-access-zds78") pod "2a872d21-c6ee-47bc-a31e-a529bd4e2ff4" (UID: "2a872d21-c6ee-47bc-a31e-a529bd4e2ff4"). InnerVolumeSpecName "kube-api-access-zds78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 22:26:25 crc kubenswrapper[4830]: I1203 22:26:25.836974 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a872d21-c6ee-47bc-a31e-a529bd4e2ff4" (UID: "2a872d21-c6ee-47bc-a31e-a529bd4e2ff4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:26:25 crc kubenswrapper[4830]: I1203 22:26:25.855380 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-config-data" (OuterVolumeSpecName: "config-data") pod "2a872d21-c6ee-47bc-a31e-a529bd4e2ff4" (UID: "2a872d21-c6ee-47bc-a31e-a529bd4e2ff4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:26:25 crc kubenswrapper[4830]: I1203 22:26:25.898923 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zds78\" (UniqueName: \"kubernetes.io/projected/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-kube-api-access-zds78\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:25 crc kubenswrapper[4830]: I1203 22:26:25.899121 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:25 crc kubenswrapper[4830]: I1203 22:26:25.899195 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.312296 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6j6q9" event={"ID":"2a872d21-c6ee-47bc-a31e-a529bd4e2ff4","Type":"ContainerDied","Data":"9a102987ecb1b4f4610dc425d36dc302b036f8565c8c1878c34fa32c877d3417"}
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.312342 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a102987ecb1b4f4610dc425d36dc302b036f8565c8c1878c34fa32c877d3417"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.312401 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6j6q9"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.695417 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qdm4z"]
Dec 03 22:26:26 crc kubenswrapper[4830]: E1203 22:26:26.700107 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a872d21-c6ee-47bc-a31e-a529bd4e2ff4" containerName="keystone-db-sync"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.700132 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a872d21-c6ee-47bc-a31e-a529bd4e2ff4" containerName="keystone-db-sync"
Dec 03 22:26:26 crc kubenswrapper[4830]: E1203 22:26:26.700332 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbc9888-4a88-4d22-a914-99d23eefb4cd" containerName="init"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.700345 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbc9888-4a88-4d22-a914-99d23eefb4cd" containerName="init"
Dec 03 22:26:26 crc kubenswrapper[4830]: E1203 22:26:26.700364 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbc9888-4a88-4d22-a914-99d23eefb4cd" containerName="dnsmasq-dns"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.700370 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbc9888-4a88-4d22-a914-99d23eefb4cd" containerName="dnsmasq-dns"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.700909 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a872d21-c6ee-47bc-a31e-a529bd4e2ff4" containerName="keystone-db-sync"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.700947 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cbc9888-4a88-4d22-a914-99d23eefb4cd" containerName="dnsmasq-dns"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.701610 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qdm4z"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.712177 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.712955 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.713208 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.713385 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6cfjx"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.713620 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.728368 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qdm4z"]
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.783609 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tl8d8"]
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.783811 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8" podUID="1a49f854-d439-4cc8-8183-84f0b5c6a0b1" containerName="dnsmasq-dns" containerID="cri-o://f87921027bd0ee6986522a671cbbec41dce3432f40ce134e0c856c3942e1b174" gracePeriod=10
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.816803 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-credential-keys\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.816989 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-scripts\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.817033 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-combined-ca-bundle\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.817193 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-config-data\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.817245 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mrt8\" (UniqueName: \"kubernetes.io/projected/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-kube-api-access-4mrt8\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.817265 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-fernet-keys\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.842305 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-kbt8k"]
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.843832 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-kbt8k"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.879023 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-kbt8k"]
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.918950 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-credential-keys\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.919057 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.919086 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.919114 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vlzb\" (UniqueName: \"kubernetes.io/projected/c79b5b83-0911-4e22-83b3-566441e0a6c0-kube-api-access-2vlzb\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.919162 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-scripts\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.919182 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-combined-ca-bundle\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.919220 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.919260 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-config-data\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z"
Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.919280 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-config\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID:
\"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.919302 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-svc\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.919323 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mrt8\" (UniqueName: \"kubernetes.io/projected/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-kube-api-access-4mrt8\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z" Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.919340 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-fernet-keys\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z" Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.924253 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-scripts\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z" Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.924581 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-config-data\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z" Dec 03 22:26:26 crc 
kubenswrapper[4830]: I1203 22:26:26.925840 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-combined-ca-bundle\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z" Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.930147 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-fernet-keys\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z" Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.932075 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-credential-keys\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z" Dec 03 22:26:26 crc kubenswrapper[4830]: I1203 22:26:26.992875 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mrt8\" (UniqueName: \"kubernetes.io/projected/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-kube-api-access-4mrt8\") pod \"keystone-bootstrap-qdm4z\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " pod="openstack/keystone-bootstrap-qdm4z" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.022728 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-config\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.023016 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-svc\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.023183 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.023263 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.023366 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vlzb\" (UniqueName: \"kubernetes.io/projected/c79b5b83-0911-4e22-83b3-566441e0a6c0-kube-api-access-2vlzb\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.023488 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.024532 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.025130 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-config\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.025698 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-svc\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.026266 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.026819 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.028791 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qdm4z" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.092479 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vlzb\" (UniqueName: \"kubernetes.io/projected/c79b5b83-0911-4e22-83b3-566441e0a6c0-kube-api-access-2vlzb\") pod \"dnsmasq-dns-847c4cc679-kbt8k\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.102461 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.113847 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.121805 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.122017 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.159051 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.173951 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.220292 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-z5lqr"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.221412 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.227614 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg247\" (UniqueName: \"kubernetes.io/projected/7f855a9c-02a9-47af-832c-4a48c1cb26ff-kube-api-access-qg247\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.227653 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-config-data\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.227728 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-log-httpd\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.227758 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.227773 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-scripts\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.227793 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.227829 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-run-httpd\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.257960 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-67qfl"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.258016 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lllk2" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.260019 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.258062 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.258114 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.284746 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-67qfl"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.302934 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.303551 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.304093 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-pzjnl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.304333 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346129 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-db-sync-config-data\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346177 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgmn\" (UniqueName: \"kubernetes.io/projected/a9793d86-3f26-4443-b740-c2bbcc65f58c-kube-api-access-7wgmn\") pod \"cinder-db-sync-z5lqr\" (UID: 
\"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346260 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-log-httpd\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346284 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9793d86-3f26-4443-b740-c2bbcc65f58c-etc-machine-id\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346318 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk946\" (UniqueName: \"kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-kube-api-access-mk946\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346355 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346376 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-scripts\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346409 
4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346499 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-run-httpd\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346558 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-config-data\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346583 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-config-data\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346652 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-combined-ca-bundle\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346678 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg247\" (UniqueName: 
\"kubernetes.io/projected/7f855a9c-02a9-47af-832c-4a48c1cb26ff-kube-api-access-qg247\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346703 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-config-data\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346724 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-certs\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346769 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-combined-ca-bundle\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346788 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-scripts\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.346809 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-scripts\") pod \"cloudkitty-db-sync-67qfl\" (UID: 
\"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.347335 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-log-httpd\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.353181 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-run-httpd\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.373574 4830 generic.go:334] "Generic (PLEG): container finished" podID="1a49f854-d439-4cc8-8183-84f0b5c6a0b1" containerID="f87921027bd0ee6986522a671cbbec41dce3432f40ce134e0c856c3942e1b174" exitCode=0 Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.376689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-config-data\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.387276 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.400247 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.407599 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-scripts\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.439492 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg247\" (UniqueName: \"kubernetes.io/projected/7f855a9c-02a9-47af-832c-4a48c1cb26ff-kube-api-access-qg247\") pod \"ceilometer-0\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.448484 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9793d86-3f26-4443-b740-c2bbcc65f58c-etc-machine-id\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.448551 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk946\" (UniqueName: \"kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-kube-api-access-mk946\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.448624 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-config-data\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.448631 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9793d86-3f26-4443-b740-c2bbcc65f58c-etc-machine-id\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.448642 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-config-data\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.448773 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-combined-ca-bundle\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.448809 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-certs\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.448867 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-combined-ca-bundle\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.448886 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-scripts\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.448905 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-scripts\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.448991 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-db-sync-config-data\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.449020 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wgmn\" (UniqueName: \"kubernetes.io/projected/a9793d86-3f26-4443-b740-c2bbcc65f58c-kube-api-access-7wgmn\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.454581 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8" event={"ID":"1a49f854-d439-4cc8-8183-84f0b5c6a0b1","Type":"ContainerDied","Data":"f87921027bd0ee6986522a671cbbec41dce3432f40ce134e0c856c3942e1b174"} Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.454844 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-4g65n"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.462680 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-z5lqr"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 
22:26:27.462729 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4g65n"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.462823 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4g65n" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.465681 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.481003 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-m9jgq" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.483814 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-combined-ca-bundle\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.484617 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-scripts\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.484887 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.489978 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-config-data\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.490344 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-scripts\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.490574 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wgmn\" (UniqueName: \"kubernetes.io/projected/a9793d86-3f26-4443-b740-c2bbcc65f58c-kube-api-access-7wgmn\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.491098 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-db-sync-config-data\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.493394 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-combined-ca-bundle\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.493636 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-config-data\") pod \"cinder-db-sync-z5lqr\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.494994 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-94wsw"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.496398 4830 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-db-sync-94wsw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.497285 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-certs\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.501319 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.513805 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n2pqj" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.514006 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.522791 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk946\" (UniqueName: \"kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-kube-api-access-mk946\") pod \"cloudkitty-db-sync-67qfl\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.540757 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-94wsw"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.550803 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-db-sync-config-data\") pod \"barbican-db-sync-94wsw\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " pod="openstack/barbican-db-sync-94wsw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.550868 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-config\") pod \"neutron-db-sync-4g65n\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " pod="openstack/neutron-db-sync-4g65n" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.550895 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-combined-ca-bundle\") pod \"neutron-db-sync-4g65n\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " pod="openstack/neutron-db-sync-4g65n" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.550921 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwnqw\" (UniqueName: \"kubernetes.io/projected/2cee81fc-684a-4fd2-886a-d899c16a5f8b-kube-api-access-vwnqw\") pod \"neutron-db-sync-4g65n\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " pod="openstack/neutron-db-sync-4g65n" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.550944 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgs9l\" (UniqueName: \"kubernetes.io/projected/e08a2474-4282-4814-8d14-438d92f1c593-kube-api-access-hgs9l\") pod \"barbican-db-sync-94wsw\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " pod="openstack/barbican-db-sync-94wsw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.550959 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-combined-ca-bundle\") pod \"barbican-db-sync-94wsw\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " pod="openstack/barbican-db-sync-94wsw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.574666 4830 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-kbt8k"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.591735 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-rjqfw"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.600060 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.611121 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-vcrcr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.616150 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.616366 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.622254 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.632680 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rjqfw"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.652369 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-combined-ca-bundle\") pod \"neutron-db-sync-4g65n\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " pod="openstack/neutron-db-sync-4g65n" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.652419 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwnqw\" (UniqueName: \"kubernetes.io/projected/2cee81fc-684a-4fd2-886a-d899c16a5f8b-kube-api-access-vwnqw\") pod \"neutron-db-sync-4g65n\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " pod="openstack/neutron-db-sync-4g65n" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.652449 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w66w7\" (UniqueName: \"kubernetes.io/projected/f130ecd2-4428-4f1e-a386-7084fc52689b-kube-api-access-w66w7\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.652474 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgs9l\" (UniqueName: \"kubernetes.io/projected/e08a2474-4282-4814-8d14-438d92f1c593-kube-api-access-hgs9l\") pod \"barbican-db-sync-94wsw\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " pod="openstack/barbican-db-sync-94wsw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.652494 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-combined-ca-bundle\") pod \"barbican-db-sync-94wsw\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " pod="openstack/barbican-db-sync-94wsw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.652545 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f130ecd2-4428-4f1e-a386-7084fc52689b-logs\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.652624 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-config-data\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.652672 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-db-sync-config-data\") pod \"barbican-db-sync-94wsw\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " pod="openstack/barbican-db-sync-94wsw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.652705 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-scripts\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.652723 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-combined-ca-bundle\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.652744 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-config\") pod \"neutron-db-sync-4g65n\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " pod="openstack/neutron-db-sync-4g65n" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.656125 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-combined-ca-bundle\") pod \"barbican-db-sync-94wsw\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " pod="openstack/barbican-db-sync-94wsw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.658673 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-db-sync-config-data\") pod \"barbican-db-sync-94wsw\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " pod="openstack/barbican-db-sync-94wsw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.660579 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kfvth"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.664374 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.668484 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-config\") pod \"neutron-db-sync-4g65n\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " pod="openstack/neutron-db-sync-4g65n" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.677851 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.680848 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-combined-ca-bundle\") pod \"neutron-db-sync-4g65n\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " pod="openstack/neutron-db-sync-4g65n" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.685328 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kfvth"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.693112 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgs9l\" (UniqueName: \"kubernetes.io/projected/e08a2474-4282-4814-8d14-438d92f1c593-kube-api-access-hgs9l\") pod \"barbican-db-sync-94wsw\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " pod="openstack/barbican-db-sync-94wsw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.698011 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwnqw\" (UniqueName: \"kubernetes.io/projected/2cee81fc-684a-4fd2-886a-d899c16a5f8b-kube-api-access-vwnqw\") pod \"neutron-db-sync-4g65n\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " pod="openstack/neutron-db-sync-4g65n" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.754214 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.754270 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w66w7\" (UniqueName: \"kubernetes.io/projected/f130ecd2-4428-4f1e-a386-7084fc52689b-kube-api-access-w66w7\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.754295 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-config\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.754314 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f130ecd2-4428-4f1e-a386-7084fc52689b-logs\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.754336 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.754384 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.754412 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-config-data\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.754461 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.754481 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq7z8\" (UniqueName: \"kubernetes.io/projected/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-kube-api-access-jq7z8\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.757081 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f130ecd2-4428-4f1e-a386-7084fc52689b-logs\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.754499 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-scripts\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.758774 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-combined-ca-bundle\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.762025 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-scripts\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.772540 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-combined-ca-bundle\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.772826 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-config-data\") pod \"placement-db-sync-rjqfw\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.804779 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w66w7\" (UniqueName: \"kubernetes.io/projected/f130ecd2-4428-4f1e-a386-7084fc52689b-kube-api-access-w66w7\") pod \"placement-db-sync-rjqfw\" (UID: 
\"f130ecd2-4428-4f1e-a386-7084fc52689b\") " pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.836045 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.838673 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.843904 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.845718 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.845925 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.846011 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hc559" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.855594 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4g65n" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.861600 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.861649 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq7z8\" (UniqueName: \"kubernetes.io/projected/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-kube-api-access-jq7z8\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.861686 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.861719 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-config\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.861745 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.861795 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.875457 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.886518 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq7z8\" (UniqueName: \"kubernetes.io/projected/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-kube-api-access-jq7z8\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.923960 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-94wsw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.941288 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rjqfw" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.956060 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.957738 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.958132 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.961185 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-config\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.964011 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " 
pod="openstack/glance-default-external-api-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.964090 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-config-data\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.964115 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.964211 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdhkh\" (UniqueName: \"kubernetes.io/projected/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-kube-api-access-pdhkh\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.964260 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-logs\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.964285 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.964316 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.964368 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:27 crc kubenswrapper[4830]: I1203 22:26:27.965787 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-kfvth\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.009459 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.066635 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.066991 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdhkh\" (UniqueName: \"kubernetes.io/projected/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-kube-api-access-pdhkh\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.067110 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-logs\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.067192 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.067274 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 
22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.067384 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.067525 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.067655 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-config-data\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.075800 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.077911 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-logs\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 
22:26:28.093831 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.093907 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb08e8461f959e86adb7412ffdaebbf2e0d85ec485d90c5b01024a06c9055cd6/globalmount\"" pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.095608 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.101664 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.103215 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-config-data\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.105936 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.109587 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdhkh\" (UniqueName: \"kubernetes.io/projected/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-kube-api-access-pdhkh\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.121667 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.124065 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.136816 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.169789 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.204214 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.219986 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 22:26:28 crc 
kubenswrapper[4830]: I1203 22:26:28.237691 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qdm4z"] Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.255197 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.274672 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.274733 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nscvv\" (UniqueName: \"kubernetes.io/projected/bcc681c7-0cca-4f44-b042-94537d4ac99d-kube-api-access-nscvv\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.274909 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.274979 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-logs\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 
22:26:28.275041 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.275218 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.275765 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.275804 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.298033 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.385826 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-config\") pod \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.385909 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-sb\") pod \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386032 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k45rn\" (UniqueName: \"kubernetes.io/projected/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-kube-api-access-k45rn\") pod \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386101 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-svc\") pod \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386273 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-nb\") pod \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386316 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-swift-storage-0\") pod \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\" (UID: \"1a49f854-d439-4cc8-8183-84f0b5c6a0b1\") " Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386681 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386725 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-logs\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386758 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386827 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386889 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386915 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386949 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.386996 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nscvv\" (UniqueName: \"kubernetes.io/projected/bcc681c7-0cca-4f44-b042-94537d4ac99d-kube-api-access-nscvv\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.388277 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-logs\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.391347 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qdm4z" 
event={"ID":"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94","Type":"ContainerStarted","Data":"4c2688c657ba1b7d8742f81989aebd098cdbfd4f72ae549bf4a4c1fbc5a7baba"} Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.391772 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.394014 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.397640 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.399843 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.401478 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nscvv\" (UniqueName: \"kubernetes.io/projected/bcc681c7-0cca-4f44-b042-94537d4ac99d-kube-api-access-nscvv\") pod \"glance-default-internal-api-0\" (UID: 
\"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.402857 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.406772 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-kube-api-access-k45rn" (OuterVolumeSpecName: "kube-api-access-k45rn") pod "1a49f854-d439-4cc8-8183-84f0b5c6a0b1" (UID: "1a49f854-d439-4cc8-8183-84f0b5c6a0b1"). InnerVolumeSpecName "kube-api-access-k45rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.413977 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.414070 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/84423eab78faf98d19a8f63b4de761c719e5f7e98ae028cf8dc7a99f1fabf2c3/globalmount\"" pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.414608 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8" event={"ID":"1a49f854-d439-4cc8-8183-84f0b5c6a0b1","Type":"ContainerDied","Data":"3ed26745d63fe32cc6e166362e8e5066f5644fbd0814d3d2c7677dd830ab5359"} Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.414655 4830 scope.go:117] "RemoveContainer" containerID="f87921027bd0ee6986522a671cbbec41dce3432f40ce134e0c856c3942e1b174" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.414770 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-tl8d8" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.439093 4830 scope.go:117] "RemoveContainer" containerID="d5815704f76914b486b6a708a7fb1c30c7b64e4af21e6d3386120c518c0de336" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.457127 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1a49f854-d439-4cc8-8183-84f0b5c6a0b1" (UID: "1a49f854-d439-4cc8-8183-84f0b5c6a0b1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.491262 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k45rn\" (UniqueName: \"kubernetes.io/projected/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-kube-api-access-k45rn\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.491285 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.598999 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1a49f854-d439-4cc8-8183-84f0b5c6a0b1" (UID: "1a49f854-d439-4cc8-8183-84f0b5c6a0b1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.607462 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-kbt8k"] Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.617984 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a49f854-d439-4cc8-8183-84f0b5c6a0b1" (UID: "1a49f854-d439-4cc8-8183-84f0b5c6a0b1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.621357 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-z5lqr"] Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.627823 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-config" (OuterVolumeSpecName: "config") pod "1a49f854-d439-4cc8-8183-84f0b5c6a0b1" (UID: "1a49f854-d439-4cc8-8183-84f0b5c6a0b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.648339 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:26:28 crc kubenswrapper[4830]: W1203 22:26:28.648421 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f855a9c_02a9_47af_832c_4a48c1cb26ff.slice/crio-34fc9dc82e96eac43a58a72e73553400a113cd6dcd5ff9886fe64e582f3af392 WatchSource:0}: Error finding container 34fc9dc82e96eac43a58a72e73553400a113cd6dcd5ff9886fe64e582f3af392: Status 404 returned error can't find the container with id 34fc9dc82e96eac43a58a72e73553400a113cd6dcd5ff9886fe64e582f3af392 Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.658542 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.668530 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-67qfl"] Dec 03 22:26:28 crc kubenswrapper[4830]: W1203 22:26:28.668988 4830 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4f0abb0_964c_42d5_8a2d_2cdf84d049c7.slice/crio-26dc414c40c3bfdb2e4e3f56ed34bf980fc69fc60e1196548e5649b4f4618490 WatchSource:0}: Error finding container 26dc414c40c3bfdb2e4e3f56ed34bf980fc69fc60e1196548e5649b4f4618490: Status 404 returned error can't find the container with id 26dc414c40c3bfdb2e4e3f56ed34bf980fc69fc60e1196548e5649b4f4618490 Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.682420 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1a49f854-d439-4cc8-8183-84f0b5c6a0b1" (UID: "1a49f854-d439-4cc8-8183-84f0b5c6a0b1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.694951 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.694985 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.694996 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.695004 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a49f854-d439-4cc8-8183-84f0b5c6a0b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:28 crc kubenswrapper[4830]: 
I1203 22:26:28.770320 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tl8d8"] Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.778158 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tl8d8"] Dec 03 22:26:28 crc kubenswrapper[4830]: I1203 22:26:28.884240 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.029146 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-94wsw"] Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.051287 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kfvth"] Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.076347 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4g65n"] Dec 03 22:26:29 crc kubenswrapper[4830]: W1203 22:26:29.085250 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cee81fc_684a_4fd2_886a_d899c16a5f8b.slice/crio-4ee7ff51b8a335cda0ebfb8be4718486bc837e5590365cff1a26c277646c0c3d WatchSource:0}: Error finding container 4ee7ff51b8a335cda0ebfb8be4718486bc837e5590365cff1a26c277646c0c3d: Status 404 returned error can't find the container with id 4ee7ff51b8a335cda0ebfb8be4718486bc837e5590365cff1a26c277646c0c3d Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.086938 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rjqfw"] Dec 03 22:26:29 crc kubenswrapper[4830]: W1203 22:26:29.095398 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode08a2474_4282_4814_8d14_438d92f1c593.slice/crio-f11e1a27f24aa8e4e232388dd01b2a20dcb5f1ac13283e7b834d653210fb5206 WatchSource:0}: Error finding 
container f11e1a27f24aa8e4e232388dd01b2a20dcb5f1ac13283e7b834d653210fb5206: Status 404 returned error can't find the container with id f11e1a27f24aa8e4e232388dd01b2a20dcb5f1ac13283e7b834d653210fb5206 Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.232102 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 22:26:29 crc kubenswrapper[4830]: W1203 22:26:29.254135 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf130ecd2_4428_4f1e_a386_7084fc52689b.slice/crio-99c046c0d394ec6d9256c715e6467935a9f88f14c2b631d4ec280cb1ed7e1370 WatchSource:0}: Error finding container 99c046c0d394ec6d9256c715e6467935a9f88f14c2b631d4ec280cb1ed7e1370: Status 404 returned error can't find the container with id 99c046c0d394ec6d9256c715e6467935a9f88f14c2b631d4ec280cb1ed7e1370 Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.368099 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a49f854-d439-4cc8-8183-84f0b5c6a0b1" path="/var/lib/kubelet/pods/1a49f854-d439-4cc8-8183-84f0b5c6a0b1/volumes" Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.420012 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.455781 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qdm4z" event={"ID":"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94","Type":"ContainerStarted","Data":"d54586ae7f972138b491d4d8a5a3b90851ec3a1596fd487fa8203019a4d7ea6c"} Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.464098 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4g65n" event={"ID":"2cee81fc-684a-4fd2-886a-d899c16a5f8b","Type":"ContainerStarted","Data":"4ee7ff51b8a335cda0ebfb8be4718486bc837e5590365cff1a26c277646c0c3d"} Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 
22:26:29.492629 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rjqfw" event={"ID":"f130ecd2-4428-4f1e-a386-7084fc52689b","Type":"ContainerStarted","Data":"99c046c0d394ec6d9256c715e6467935a9f88f14c2b631d4ec280cb1ed7e1370"} Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.507670 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qdm4z" podStartSLOduration=3.507652242 podStartE2EDuration="3.507652242s" podCreationTimestamp="2025-12-03 22:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:29.475252812 +0000 UTC m=+1278.471714161" watchObservedRunningTime="2025-12-03 22:26:29.507652242 +0000 UTC m=+1278.504113581" Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.543808 4830 generic.go:334] "Generic (PLEG): container finished" podID="c79b5b83-0911-4e22-83b3-566441e0a6c0" containerID="d3e403d96035f70544ccc26b824ac24136e842fbc840d4b49557d729374e9526" exitCode=0 Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.543885 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" event={"ID":"c79b5b83-0911-4e22-83b3-566441e0a6c0","Type":"ContainerDied","Data":"d3e403d96035f70544ccc26b824ac24136e842fbc840d4b49557d729374e9526"} Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.543910 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" event={"ID":"c79b5b83-0911-4e22-83b3-566441e0a6c0","Type":"ContainerStarted","Data":"97bff2f43668e6d55db32ea99268c917d0033216dd9339c7ed46823a8c04e130"} Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.570746 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.572810 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-z5lqr" event={"ID":"a9793d86-3f26-4443-b740-c2bbcc65f58c","Type":"ContainerStarted","Data":"2a17ed687a3a72ba98e0d26dd5d66778270bf8ee1bd004597bc7266e8f52cab6"} Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.602653 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f855a9c-02a9-47af-832c-4a48c1cb26ff","Type":"ContainerStarted","Data":"34fc9dc82e96eac43a58a72e73553400a113cd6dcd5ff9886fe64e582f3af392"} Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.621469 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-67qfl" event={"ID":"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7","Type":"ContainerStarted","Data":"26dc414c40c3bfdb2e4e3f56ed34bf980fc69fc60e1196548e5649b4f4618490"} Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.650739 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-94wsw" event={"ID":"e08a2474-4282-4814-8d14-438d92f1c593","Type":"ContainerStarted","Data":"f11e1a27f24aa8e4e232388dd01b2a20dcb5f1ac13283e7b834d653210fb5206"} Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.653807 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eacca0c5-e49f-418d-b30f-a8e3cd47a20b","Type":"ContainerStarted","Data":"6f08d2886acc62c721da8146b185d9a1c4715b9a2bbf8e0eceaa00f87801142a"} Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.670222 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" event={"ID":"6e96cd45-9150-437d-bbe7-4f3df3c40d1e","Type":"ContainerStarted","Data":"7e4e9b387ec50ad2866e0ddfa36298c653f31a60ad0e0d29f24db20959185ea7"} Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.679166 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:29 crc kubenswrapper[4830]: I1203 22:26:29.848575 4830 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.487239 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.566458 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-svc\") pod \"c79b5b83-0911-4e22-83b3-566441e0a6c0\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.566559 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vlzb\" (UniqueName: \"kubernetes.io/projected/c79b5b83-0911-4e22-83b3-566441e0a6c0-kube-api-access-2vlzb\") pod \"c79b5b83-0911-4e22-83b3-566441e0a6c0\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.566641 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-config\") pod \"c79b5b83-0911-4e22-83b3-566441e0a6c0\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.566678 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-swift-storage-0\") pod \"c79b5b83-0911-4e22-83b3-566441e0a6c0\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.566854 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-sb\") pod \"c79b5b83-0911-4e22-83b3-566441e0a6c0\" (UID: 
\"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.566903 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-nb\") pod \"c79b5b83-0911-4e22-83b3-566441e0a6c0\" (UID: \"c79b5b83-0911-4e22-83b3-566441e0a6c0\") " Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.576366 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c79b5b83-0911-4e22-83b3-566441e0a6c0-kube-api-access-2vlzb" (OuterVolumeSpecName: "kube-api-access-2vlzb") pod "c79b5b83-0911-4e22-83b3-566441e0a6c0" (UID: "c79b5b83-0911-4e22-83b3-566441e0a6c0"). InnerVolumeSpecName "kube-api-access-2vlzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.601020 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c79b5b83-0911-4e22-83b3-566441e0a6c0" (UID: "c79b5b83-0911-4e22-83b3-566441e0a6c0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.606549 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c79b5b83-0911-4e22-83b3-566441e0a6c0" (UID: "c79b5b83-0911-4e22-83b3-566441e0a6c0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.615654 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-config" (OuterVolumeSpecName: "config") pod "c79b5b83-0911-4e22-83b3-566441e0a6c0" (UID: "c79b5b83-0911-4e22-83b3-566441e0a6c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.616902 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c79b5b83-0911-4e22-83b3-566441e0a6c0" (UID: "c79b5b83-0911-4e22-83b3-566441e0a6c0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.637125 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c79b5b83-0911-4e22-83b3-566441e0a6c0" (UID: "c79b5b83-0911-4e22-83b3-566441e0a6c0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.669539 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.669840 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.669851 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.669861 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vlzb\" (UniqueName: \"kubernetes.io/projected/c79b5b83-0911-4e22-83b3-566441e0a6c0-kube-api-access-2vlzb\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.669871 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.669880 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c79b5b83-0911-4e22-83b3-566441e0a6c0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.684466 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" event={"ID":"c79b5b83-0911-4e22-83b3-566441e0a6c0","Type":"ContainerDied","Data":"97bff2f43668e6d55db32ea99268c917d0033216dd9339c7ed46823a8c04e130"} Dec 03 22:26:30 crc 
kubenswrapper[4830]: I1203 22:26:30.684484 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-kbt8k" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.684589 4830 scope.go:117] "RemoveContainer" containerID="d3e403d96035f70544ccc26b824ac24136e842fbc840d4b49557d729374e9526" Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.693240 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcc681c7-0cca-4f44-b042-94537d4ac99d","Type":"ContainerStarted","Data":"20cd821764f5ad73ed59de4fb087019d81e650514641f290d3ecaa8714f3797e"} Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.714620 4830 generic.go:334] "Generic (PLEG): container finished" podID="6e96cd45-9150-437d-bbe7-4f3df3c40d1e" containerID="f24cd8d83d5a4dd35076861b8baa3ac7317c0243bad2b4bb5aee3aa8e94b12d4" exitCode=0 Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.714715 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" event={"ID":"6e96cd45-9150-437d-bbe7-4f3df3c40d1e","Type":"ContainerDied","Data":"f24cd8d83d5a4dd35076861b8baa3ac7317c0243bad2b4bb5aee3aa8e94b12d4"} Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.738587 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4g65n" event={"ID":"2cee81fc-684a-4fd2-886a-d899c16a5f8b","Type":"ContainerStarted","Data":"2e900a8f197bb84a2334d2d3e0082a9759630f7ff33a9805947d537402c76536"} Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.766034 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-kbt8k"] Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.774377 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-kbt8k"] Dec 03 22:26:30 crc kubenswrapper[4830]: I1203 22:26:30.779380 4830 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/neutron-db-sync-4g65n" podStartSLOduration=3.7793640440000003 podStartE2EDuration="3.779364044s" podCreationTimestamp="2025-12-03 22:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:30.758663371 +0000 UTC m=+1279.755124720" watchObservedRunningTime="2025-12-03 22:26:30.779364044 +0000 UTC m=+1279.775825393" Dec 03 22:26:31 crc kubenswrapper[4830]: I1203 22:26:31.353295 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c79b5b83-0911-4e22-83b3-566441e0a6c0" path="/var/lib/kubelet/pods/c79b5b83-0911-4e22-83b3-566441e0a6c0/volumes" Dec 03 22:26:31 crc kubenswrapper[4830]: I1203 22:26:31.789210 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eacca0c5-e49f-418d-b30f-a8e3cd47a20b","Type":"ContainerStarted","Data":"4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e"} Dec 03 22:26:31 crc kubenswrapper[4830]: I1203 22:26:31.796671 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcc681c7-0cca-4f44-b042-94537d4ac99d","Type":"ContainerStarted","Data":"f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba"} Dec 03 22:26:32 crc kubenswrapper[4830]: I1203 22:26:32.864606 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcc681c7-0cca-4f44-b042-94537d4ac99d","Type":"ContainerStarted","Data":"6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520"} Dec 03 22:26:32 crc kubenswrapper[4830]: I1203 22:26:32.864944 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bcc681c7-0cca-4f44-b042-94537d4ac99d" containerName="glance-log" containerID="cri-o://f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba" 
gracePeriod=30 Dec 03 22:26:32 crc kubenswrapper[4830]: I1203 22:26:32.865234 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bcc681c7-0cca-4f44-b042-94537d4ac99d" containerName="glance-httpd" containerID="cri-o://6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520" gracePeriod=30 Dec 03 22:26:32 crc kubenswrapper[4830]: I1203 22:26:32.883967 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eacca0c5-e49f-418d-b30f-a8e3cd47a20b","Type":"ContainerStarted","Data":"6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78"} Dec 03 22:26:32 crc kubenswrapper[4830]: I1203 22:26:32.884125 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eacca0c5-e49f-418d-b30f-a8e3cd47a20b" containerName="glance-log" containerID="cri-o://4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e" gracePeriod=30 Dec 03 22:26:32 crc kubenswrapper[4830]: I1203 22:26:32.884238 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eacca0c5-e49f-418d-b30f-a8e3cd47a20b" containerName="glance-httpd" containerID="cri-o://6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78" gracePeriod=30 Dec 03 22:26:32 crc kubenswrapper[4830]: I1203 22:26:32.889555 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.889538955 podStartE2EDuration="6.889538955s" podCreationTimestamp="2025-12-03 22:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:32.886954595 +0000 UTC m=+1281.883415944" watchObservedRunningTime="2025-12-03 22:26:32.889538955 +0000 UTC m=+1281.886000294" Dec 03 22:26:32 
crc kubenswrapper[4830]: I1203 22:26:32.906786 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" event={"ID":"6e96cd45-9150-437d-bbe7-4f3df3c40d1e","Type":"ContainerStarted","Data":"172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea"} Dec 03 22:26:32 crc kubenswrapper[4830]: I1203 22:26:32.907346 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:32 crc kubenswrapper[4830]: I1203 22:26:32.923880 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.923863979 podStartE2EDuration="6.923863979s" podCreationTimestamp="2025-12-03 22:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:32.916288873 +0000 UTC m=+1281.912750222" watchObservedRunningTime="2025-12-03 22:26:32.923863979 +0000 UTC m=+1281.920325328" Dec 03 22:26:32 crc kubenswrapper[4830]: I1203 22:26:32.950246 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" podStartSLOduration=5.950226076 podStartE2EDuration="5.950226076s" podCreationTimestamp="2025-12-03 22:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:32.946194836 +0000 UTC m=+1281.942656175" watchObservedRunningTime="2025-12-03 22:26:32.950226076 +0000 UTC m=+1281.946687425" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.637343 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.742145 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-combined-ca-bundle\") pod \"bcc681c7-0cca-4f44-b042-94537d4ac99d\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.742390 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"bcc681c7-0cca-4f44-b042-94537d4ac99d\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.742441 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-internal-tls-certs\") pod \"bcc681c7-0cca-4f44-b042-94537d4ac99d\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.742469 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nscvv\" (UniqueName: \"kubernetes.io/projected/bcc681c7-0cca-4f44-b042-94537d4ac99d-kube-api-access-nscvv\") pod \"bcc681c7-0cca-4f44-b042-94537d4ac99d\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.742517 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-config-data\") pod \"bcc681c7-0cca-4f44-b042-94537d4ac99d\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.742569 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-scripts\") pod \"bcc681c7-0cca-4f44-b042-94537d4ac99d\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.742615 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-httpd-run\") pod \"bcc681c7-0cca-4f44-b042-94537d4ac99d\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.742672 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-logs\") pod \"bcc681c7-0cca-4f44-b042-94537d4ac99d\" (UID: \"bcc681c7-0cca-4f44-b042-94537d4ac99d\") " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.743396 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-logs" (OuterVolumeSpecName: "logs") pod "bcc681c7-0cca-4f44-b042-94537d4ac99d" (UID: "bcc681c7-0cca-4f44-b042-94537d4ac99d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.747805 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bcc681c7-0cca-4f44-b042-94537d4ac99d" (UID: "bcc681c7-0cca-4f44-b042-94537d4ac99d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.758157 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-scripts" (OuterVolumeSpecName: "scripts") pod "bcc681c7-0cca-4f44-b042-94537d4ac99d" (UID: "bcc681c7-0cca-4f44-b042-94537d4ac99d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.761197 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc681c7-0cca-4f44-b042-94537d4ac99d-kube-api-access-nscvv" (OuterVolumeSpecName: "kube-api-access-nscvv") pod "bcc681c7-0cca-4f44-b042-94537d4ac99d" (UID: "bcc681c7-0cca-4f44-b042-94537d4ac99d"). InnerVolumeSpecName "kube-api-access-nscvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.775197 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180" (OuterVolumeSpecName: "glance") pod "bcc681c7-0cca-4f44-b042-94537d4ac99d" (UID: "bcc681c7-0cca-4f44-b042-94537d4ac99d"). InnerVolumeSpecName "pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.796042 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcc681c7-0cca-4f44-b042-94537d4ac99d" (UID: "bcc681c7-0cca-4f44-b042-94537d4ac99d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.801883 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-config-data" (OuterVolumeSpecName: "config-data") pod "bcc681c7-0cca-4f44-b042-94537d4ac99d" (UID: "bcc681c7-0cca-4f44-b042-94537d4ac99d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.817528 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bcc681c7-0cca-4f44-b042-94537d4ac99d" (UID: "bcc681c7-0cca-4f44-b042-94537d4ac99d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.845327 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.845400 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") on node \"crc\" " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.845417 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.845429 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nscvv\" (UniqueName: 
\"kubernetes.io/projected/bcc681c7-0cca-4f44-b042-94537d4ac99d-kube-api-access-nscvv\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.845441 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.845451 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc681c7-0cca-4f44-b042-94537d4ac99d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.845461 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.845472 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc681c7-0cca-4f44-b042-94537d4ac99d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.883617 4830 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.883784 4830 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180") on node "crc" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.894561 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.956534 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-public-tls-certs\") pod \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.956611 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-httpd-run\") pod \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.956635 4830 generic.go:334] "Generic (PLEG): container finished" podID="abc2ec15-cd54-4bb4-9bd1-d882f1f61b94" containerID="d54586ae7f972138b491d4d8a5a3b90851ec3a1596fd487fa8203019a4d7ea6c" exitCode=0 Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.956714 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qdm4z" event={"ID":"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94","Type":"ContainerDied","Data":"d54586ae7f972138b491d4d8a5a3b90851ec3a1596fd487fa8203019a4d7ea6c"} Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.956722 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-combined-ca-bundle\") pod \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") " Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.956778 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdhkh\" (UniqueName: \"kubernetes.io/projected/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-kube-api-access-pdhkh\") pod 
\"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") "
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.956935 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") "
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.956958 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-logs\") pod \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") "
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.956986 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-scripts\") pod \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") "
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.957045 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-config-data\") pod \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\" (UID: \"eacca0c5-e49f-418d-b30f-a8e3cd47a20b\") "
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.957323 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eacca0c5-e49f-418d-b30f-a8e3cd47a20b" (UID: "eacca0c5-e49f-418d-b30f-a8e3cd47a20b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.957777 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.957798 4830 reconciler_common.go:293] "Volume detached for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.959436 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-logs" (OuterVolumeSpecName: "logs") pod "eacca0c5-e49f-418d-b30f-a8e3cd47a20b" (UID: "eacca0c5-e49f-418d-b30f-a8e3cd47a20b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.963398 4830 generic.go:334] "Generic (PLEG): container finished" podID="bcc681c7-0cca-4f44-b042-94537d4ac99d" containerID="6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520" exitCode=0
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.963428 4830 generic.go:334] "Generic (PLEG): container finished" podID="bcc681c7-0cca-4f44-b042-94537d4ac99d" containerID="f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba" exitCode=143
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.963484 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcc681c7-0cca-4f44-b042-94537d4ac99d","Type":"ContainerDied","Data":"6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520"}
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.963552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcc681c7-0cca-4f44-b042-94537d4ac99d","Type":"ContainerDied","Data":"f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba"}
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.963569 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcc681c7-0cca-4f44-b042-94537d4ac99d","Type":"ContainerDied","Data":"20cd821764f5ad73ed59de4fb087019d81e650514641f290d3ecaa8714f3797e"}
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.963596 4830 scope.go:117] "RemoveContainer" containerID="6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520"
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.963808 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.975218 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d" (OuterVolumeSpecName: "glance") pod "eacca0c5-e49f-418d-b30f-a8e3cd47a20b" (UID: "eacca0c5-e49f-418d-b30f-a8e3cd47a20b"). InnerVolumeSpecName "pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.981846 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-scripts" (OuterVolumeSpecName: "scripts") pod "eacca0c5-e49f-418d-b30f-a8e3cd47a20b" (UID: "eacca0c5-e49f-418d-b30f-a8e3cd47a20b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.985063 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-kube-api-access-pdhkh" (OuterVolumeSpecName: "kube-api-access-pdhkh") pod "eacca0c5-e49f-418d-b30f-a8e3cd47a20b" (UID: "eacca0c5-e49f-418d-b30f-a8e3cd47a20b"). InnerVolumeSpecName "kube-api-access-pdhkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 22:26:33 crc kubenswrapper[4830]: I1203 22:26:33.999585 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eacca0c5-e49f-418d-b30f-a8e3cd47a20b" (UID: "eacca0c5-e49f-418d-b30f-a8e3cd47a20b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.011137 4830 generic.go:334] "Generic (PLEG): container finished" podID="eacca0c5-e49f-418d-b30f-a8e3cd47a20b" containerID="6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78" exitCode=0
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.011169 4830 generic.go:334] "Generic (PLEG): container finished" podID="eacca0c5-e49f-418d-b30f-a8e3cd47a20b" containerID="4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e" exitCode=143
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.012416 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.012980 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eacca0c5-e49f-418d-b30f-a8e3cd47a20b","Type":"ContainerDied","Data":"6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78"}
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.013009 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eacca0c5-e49f-418d-b30f-a8e3cd47a20b","Type":"ContainerDied","Data":"4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e"}
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.013029 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eacca0c5-e49f-418d-b30f-a8e3cd47a20b","Type":"ContainerDied","Data":"6f08d2886acc62c721da8146b185d9a1c4715b9a2bbf8e0eceaa00f87801142a"}
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.063189 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.063222 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdhkh\" (UniqueName: \"kubernetes.io/projected/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-kube-api-access-pdhkh\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.063257 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") on node \"crc\" "
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.063273 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-logs\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.063284 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.073050 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eacca0c5-e49f-418d-b30f-a8e3cd47a20b" (UID: "eacca0c5-e49f-418d-b30f-a8e3cd47a20b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.111380 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-config-data" (OuterVolumeSpecName: "config-data") pod "eacca0c5-e49f-418d-b30f-a8e3cd47a20b" (UID: "eacca0c5-e49f-418d-b30f-a8e3cd47a20b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.113276 4830 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.113410 4830 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d") on node "crc"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.165409 4830 reconciler_common.go:293] "Volume detached for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.165439 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.165449 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eacca0c5-e49f-418d-b30f-a8e3cd47a20b-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.222847 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.246615 4830 scope.go:117] "RemoveContainer" containerID="f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.249102 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265028 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 22:26:34 crc kubenswrapper[4830]: E1203 22:26:34.265493 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79b5b83-0911-4e22-83b3-566441e0a6c0" containerName="init"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265520 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79b5b83-0911-4e22-83b3-566441e0a6c0" containerName="init"
Dec 03 22:26:34 crc kubenswrapper[4830]: E1203 22:26:34.265528 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eacca0c5-e49f-418d-b30f-a8e3cd47a20b" containerName="glance-httpd"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265534 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="eacca0c5-e49f-418d-b30f-a8e3cd47a20b" containerName="glance-httpd"
Dec 03 22:26:34 crc kubenswrapper[4830]: E1203 22:26:34.265548 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eacca0c5-e49f-418d-b30f-a8e3cd47a20b" containerName="glance-log"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265554 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="eacca0c5-e49f-418d-b30f-a8e3cd47a20b" containerName="glance-log"
Dec 03 22:26:34 crc kubenswrapper[4830]: E1203 22:26:34.265565 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a49f854-d439-4cc8-8183-84f0b5c6a0b1" containerName="init"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265570 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a49f854-d439-4cc8-8183-84f0b5c6a0b1" containerName="init"
Dec 03 22:26:34 crc kubenswrapper[4830]: E1203 22:26:34.265581 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a49f854-d439-4cc8-8183-84f0b5c6a0b1" containerName="dnsmasq-dns"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265586 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a49f854-d439-4cc8-8183-84f0b5c6a0b1" containerName="dnsmasq-dns"
Dec 03 22:26:34 crc kubenswrapper[4830]: E1203 22:26:34.265601 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc681c7-0cca-4f44-b042-94537d4ac99d" containerName="glance-httpd"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265607 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc681c7-0cca-4f44-b042-94537d4ac99d" containerName="glance-httpd"
Dec 03 22:26:34 crc kubenswrapper[4830]: E1203 22:26:34.265622 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc681c7-0cca-4f44-b042-94537d4ac99d" containerName="glance-log"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265627 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc681c7-0cca-4f44-b042-94537d4ac99d" containerName="glance-log"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265800 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc681c7-0cca-4f44-b042-94537d4ac99d" containerName="glance-log"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265814 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc681c7-0cca-4f44-b042-94537d4ac99d" containerName="glance-httpd"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265834 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c79b5b83-0911-4e22-83b3-566441e0a6c0" containerName="init"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265841 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="eacca0c5-e49f-418d-b30f-a8e3cd47a20b" containerName="glance-log"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265849 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a49f854-d439-4cc8-8183-84f0b5c6a0b1" containerName="dnsmasq-dns"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.265855 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="eacca0c5-e49f-418d-b30f-a8e3cd47a20b" containerName="glance-httpd"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.269490 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.272602 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.272622 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.272819 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.306755 4830 scope.go:117] "RemoveContainer" containerID="6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520"
Dec 03 22:26:34 crc kubenswrapper[4830]: E1203 22:26:34.309084 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520\": container with ID starting with 6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520 not found: ID does not exist" containerID="6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.309132 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520"} err="failed to get container status \"6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520\": rpc error: code = NotFound desc = could not find container \"6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520\": container with ID starting with 6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520 not found: ID does not exist"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.309161 4830 scope.go:117] "RemoveContainer" containerID="f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba"
Dec 03 22:26:34 crc kubenswrapper[4830]: E1203 22:26:34.309827 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba\": container with ID starting with f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba not found: ID does not exist" containerID="f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.309875 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba"} err="failed to get container status \"f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba\": rpc error: code = NotFound desc = could not find container \"f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba\": container with ID starting with f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba not found: ID does not exist"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.309901 4830 scope.go:117] "RemoveContainer" containerID="6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.310164 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520"} err="failed to get container status \"6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520\": rpc error: code = NotFound desc = could not find container \"6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520\": container with ID starting with 6f6fc886b918f2bfccefa8e5047c1b0edae7001adf706e3e1646b4076ca0d520 not found: ID does not exist"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.310241 4830 scope.go:117] "RemoveContainer" containerID="f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.312956 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba"} err="failed to get container status \"f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba\": rpc error: code = NotFound desc = could not find container \"f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba\": container with ID starting with f38ad4c9b5f70f836e3779b6c838a7341ba1d369137a3a67e718833cc3682dba not found: ID does not exist"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.312981 4830 scope.go:117] "RemoveContainer" containerID="6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.354388 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.373550 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.373593 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs5r6\" (UniqueName: \"kubernetes.io/projected/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-kube-api-access-bs5r6\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.373626 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.373664 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-logs\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.373894 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.374074 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.374175 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.374255 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.380189 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.396069 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.397745 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.404021 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.407236 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.407474 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476192 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476238 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476271 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476290 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-logs\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476320 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476394 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-scripts\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476416 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476472 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476579 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476602 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-config-data\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476621 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476644 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t7nf\" (UniqueName: \"kubernetes.io/projected/10e75b14-a94b-4936-bd47-9da029e04272-kube-api-access-8t7nf\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476662 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs5r6\" (UniqueName: \"kubernetes.io/projected/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-kube-api-access-bs5r6\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476699 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476743 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-logs\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.476759 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.478877 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.479235 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-logs\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.480672 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.480700 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/84423eab78faf98d19a8f63b4de761c719e5f7e98ae028cf8dc7a99f1fabf2c3/globalmount\"" pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.484187 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.484759 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.485591 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.486307 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.497347 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs5r6\" (UniqueName: \"kubernetes.io/projected/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-kube-api-access-bs5r6\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.572335 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " pod="openstack/glance-default-internal-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.578913 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.578971 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.579001 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.579036 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-logs\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.579100 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-scripts\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.579200 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-config-data\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.579232 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.579265 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t7nf\" (UniqueName: \"kubernetes.io/projected/10e75b14-a94b-4936-bd47-9da029e04272-kube-api-access-8t7nf\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.579778 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-logs\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.579831 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0"
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.581073 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.581158 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb08e8461f959e86adb7412ffdaebbf2e0d85ec485d90c5b01024a06c9055cd6/globalmount\"" pod="openstack/glance-default-external-api-0" Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.582254 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-scripts\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.582450 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.586048 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.586552 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.592897 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.606608 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t7nf\" (UniqueName: \"kubernetes.io/projected/10e75b14-a94b-4936-bd47-9da029e04272-kube-api-access-8t7nf\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.620770 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " pod="openstack/glance-default-external-api-0" Dec 03 22:26:34 crc kubenswrapper[4830]: I1203 22:26:34.721023 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 22:26:35 crc kubenswrapper[4830]: I1203 22:26:35.354725 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc681c7-0cca-4f44-b042-94537d4ac99d" path="/var/lib/kubelet/pods/bcc681c7-0cca-4f44-b042-94537d4ac99d/volumes" Dec 03 22:26:35 crc kubenswrapper[4830]: I1203 22:26:35.356344 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eacca0c5-e49f-418d-b30f-a8e3cd47a20b" path="/var/lib/kubelet/pods/eacca0c5-e49f-418d-b30f-a8e3cd47a20b/volumes" Dec 03 22:26:37 crc kubenswrapper[4830]: I1203 22:26:37.680700 4830 scope.go:117] "RemoveContainer" containerID="4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.012488 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.079382 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xzbv8"] Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.079619 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-xzbv8" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerName="dnsmasq-dns" containerID="cri-o://9dfdfaad9bfdf2d13d69b094f175853adcce7f24de017b853eb1ce418bd52a17" gracePeriod=10 Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.510080 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qdm4z" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.595127 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mrt8\" (UniqueName: \"kubernetes.io/projected/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-kube-api-access-4mrt8\") pod \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.595261 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-combined-ca-bundle\") pod \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.595332 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-scripts\") pod \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.595376 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-config-data\") pod \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.595494 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-credential-keys\") pod \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.595537 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-fernet-keys\") pod \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\" (UID: \"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94\") " Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.601947 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-scripts" (OuterVolumeSpecName: "scripts") pod "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94" (UID: "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.604659 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94" (UID: "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.614228 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94" (UID: "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.614387 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-kube-api-access-4mrt8" (OuterVolumeSpecName: "kube-api-access-4mrt8") pod "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94" (UID: "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94"). InnerVolumeSpecName "kube-api-access-4mrt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.632483 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94" (UID: "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.640158 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-config-data" (OuterVolumeSpecName: "config-data") pod "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94" (UID: "abc2ec15-cd54-4bb4-9bd1-d882f1f61b94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.697566 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.697596 4830 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.697607 4830 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.697618 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mrt8\" (UniqueName: \"kubernetes.io/projected/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-kube-api-access-4mrt8\") on node \"crc\" DevicePath \"\"" 
Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.697627 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:38 crc kubenswrapper[4830]: I1203 22:26:38.697634 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.157480 4830 generic.go:334] "Generic (PLEG): container finished" podID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerID="9dfdfaad9bfdf2d13d69b094f175853adcce7f24de017b853eb1ce418bd52a17" exitCode=0 Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.157643 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xzbv8" event={"ID":"ae971dc1-0fb1-482a-a05a-2aa2adb99a53","Type":"ContainerDied","Data":"9dfdfaad9bfdf2d13d69b094f175853adcce7f24de017b853eb1ce418bd52a17"} Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.162035 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qdm4z" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.161826 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qdm4z" event={"ID":"abc2ec15-cd54-4bb4-9bd1-d882f1f61b94","Type":"ContainerDied","Data":"4c2688c657ba1b7d8742f81989aebd098cdbfd4f72ae549bf4a4c1fbc5a7baba"} Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.163444 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c2688c657ba1b7d8742f81989aebd098cdbfd4f72ae549bf4a4c1fbc5a7baba" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.645204 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qdm4z"] Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.674216 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qdm4z"] Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.754561 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-l6rwx"] Dec 03 22:26:39 crc kubenswrapper[4830]: E1203 22:26:39.755108 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc2ec15-cd54-4bb4-9bd1-d882f1f61b94" containerName="keystone-bootstrap" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.755121 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc2ec15-cd54-4bb4-9bd1-d882f1f61b94" containerName="keystone-bootstrap" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.755335 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc2ec15-cd54-4bb4-9bd1-d882f1f61b94" containerName="keystone-bootstrap" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.756237 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.759339 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.759698 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.759858 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.760056 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6cfjx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.772207 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l6rwx"] Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.820620 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-scripts\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.820688 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-credential-keys\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.820737 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-fernet-keys\") pod \"keystone-bootstrap-l6rwx\" (UID: 
\"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.820772 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64ckl\" (UniqueName: \"kubernetes.io/projected/e6917fbd-195a-4b58-83e0-27988c69b57d-kube-api-access-64ckl\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.820839 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-combined-ca-bundle\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.820881 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-config-data\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.922626 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-config-data\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.923050 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-scripts\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " 
pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.923100 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-credential-keys\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.923161 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-fernet-keys\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.923195 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64ckl\" (UniqueName: \"kubernetes.io/projected/e6917fbd-195a-4b58-83e0-27988c69b57d-kube-api-access-64ckl\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.923315 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-combined-ca-bundle\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.927065 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-config-data\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 
22:26:39.927785 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-combined-ca-bundle\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.928795 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-credential-keys\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.929081 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-scripts\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.944251 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-fernet-keys\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:39 crc kubenswrapper[4830]: I1203 22:26:39.953319 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64ckl\" (UniqueName: \"kubernetes.io/projected/e6917fbd-195a-4b58-83e0-27988c69b57d-kube-api-access-64ckl\") pod \"keystone-bootstrap-l6rwx\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:40 crc kubenswrapper[4830]: I1203 22:26:40.085517 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:26:41 crc kubenswrapper[4830]: I1203 22:26:41.233444 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-xzbv8" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Dec 03 22:26:41 crc kubenswrapper[4830]: I1203 22:26:41.350885 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc2ec15-cd54-4bb4-9bd1-d882f1f61b94" path="/var/lib/kubelet/pods/abc2ec15-cd54-4bb4-9bd1-d882f1f61b94/volumes" Dec 03 22:26:46 crc kubenswrapper[4830]: I1203 22:26:46.234279 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-xzbv8" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Dec 03 22:26:48 crc kubenswrapper[4830]: I1203 22:26:48.264558 4830 generic.go:334] "Generic (PLEG): container finished" podID="2cee81fc-684a-4fd2-886a-d899c16a5f8b" containerID="2e900a8f197bb84a2334d2d3e0082a9759630f7ff33a9805947d537402c76536" exitCode=0 Dec 03 22:26:48 crc kubenswrapper[4830]: I1203 22:26:48.264634 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4g65n" event={"ID":"2cee81fc-684a-4fd2-886a-d899c16a5f8b","Type":"ContainerDied","Data":"2e900a8f197bb84a2334d2d3e0082a9759630f7ff33a9805947d537402c76536"} Dec 03 22:26:56 crc kubenswrapper[4830]: I1203 22:26:56.233957 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-xzbv8" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Dec 03 22:26:56 crc kubenswrapper[4830]: I1203 22:26:56.236109 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:27:01 crc kubenswrapper[4830]: I1203 22:27:01.236806 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-xzbv8" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Dec 03 22:27:01 crc kubenswrapper[4830]: E1203 22:27:01.636586 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 03 22:27:01 crc kubenswrapper[4830]: E1203 22:27:01.636877 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hgs9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:ni
l,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-94wsw_openstack(e08a2474-4282-4814-8d14-438d92f1c593): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:27:01 crc kubenswrapper[4830]: E1203 22:27:01.639265 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-94wsw" podUID="e08a2474-4282-4814-8d14-438d92f1c593" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.092220 4830 scope.go:117] "RemoveContainer" containerID="6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78" Dec 03 22:27:02 crc kubenswrapper[4830]: E1203 22:27:02.092846 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78\": container with ID starting with 6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78 not found: ID does not exist" containerID="6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.092925 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78"} err="failed to get container status \"6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78\": rpc error: code = NotFound desc = could not find container 
\"6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78\": container with ID starting with 6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78 not found: ID does not exist" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.092968 4830 scope.go:117] "RemoveContainer" containerID="4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e" Dec 03 22:27:02 crc kubenswrapper[4830]: E1203 22:27:02.093395 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e\": container with ID starting with 4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e not found: ID does not exist" containerID="4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.093437 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e"} err="failed to get container status \"4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e\": rpc error: code = NotFound desc = could not find container \"4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e\": container with ID starting with 4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e not found: ID does not exist" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.093466 4830 scope.go:117] "RemoveContainer" containerID="6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.093941 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78"} err="failed to get container status \"6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78\": rpc error: code = NotFound desc = could not find 
container \"6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78\": container with ID starting with 6339a4ecc03ea69f64d3ac50bbfc07d5ccc0edcba7569c96277ba21b75658f78 not found: ID does not exist" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.093966 4830 scope.go:117] "RemoveContainer" containerID="4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.094232 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e"} err="failed to get container status \"4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e\": rpc error: code = NotFound desc = could not find container \"4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e\": container with ID starting with 4a972af2f069956a32cc8685ba52db3441a07ddd4060043c3ebda0844b09962e not found: ID does not exist" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.236853 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.245274 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4g65n" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.318347 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-config\") pod \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.318409 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-combined-ca-bundle\") pod \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.318444 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwnqw\" (UniqueName: \"kubernetes.io/projected/2cee81fc-684a-4fd2-886a-d899c16a5f8b-kube-api-access-vwnqw\") pod \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\" (UID: \"2cee81fc-684a-4fd2-886a-d899c16a5f8b\") " Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.318463 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlmrw\" (UniqueName: \"kubernetes.io/projected/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-kube-api-access-wlmrw\") pod \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.318651 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-sb\") pod \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.318775 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-nb\") pod \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.319243 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-dns-svc\") pod \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.319301 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-config\") pod \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\" (UID: \"ae971dc1-0fb1-482a-a05a-2aa2adb99a53\") " Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.323268 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cee81fc-684a-4fd2-886a-d899c16a5f8b-kube-api-access-vwnqw" (OuterVolumeSpecName: "kube-api-access-vwnqw") pod "2cee81fc-684a-4fd2-886a-d899c16a5f8b" (UID: "2cee81fc-684a-4fd2-886a-d899c16a5f8b"). InnerVolumeSpecName "kube-api-access-vwnqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.340986 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-kube-api-access-wlmrw" (OuterVolumeSpecName: "kube-api-access-wlmrw") pod "ae971dc1-0fb1-482a-a05a-2aa2adb99a53" (UID: "ae971dc1-0fb1-482a-a05a-2aa2adb99a53"). InnerVolumeSpecName "kube-api-access-wlmrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.350448 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cee81fc-684a-4fd2-886a-d899c16a5f8b" (UID: "2cee81fc-684a-4fd2-886a-d899c16a5f8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.356797 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-config" (OuterVolumeSpecName: "config") pod "2cee81fc-684a-4fd2-886a-d899c16a5f8b" (UID: "2cee81fc-684a-4fd2-886a-d899c16a5f8b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.365834 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-config" (OuterVolumeSpecName: "config") pod "ae971dc1-0fb1-482a-a05a-2aa2adb99a53" (UID: "ae971dc1-0fb1-482a-a05a-2aa2adb99a53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.385923 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ae971dc1-0fb1-482a-a05a-2aa2adb99a53" (UID: "ae971dc1-0fb1-482a-a05a-2aa2adb99a53"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.385939 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ae971dc1-0fb1-482a-a05a-2aa2adb99a53" (UID: "ae971dc1-0fb1-482a-a05a-2aa2adb99a53"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.397897 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae971dc1-0fb1-482a-a05a-2aa2adb99a53" (UID: "ae971dc1-0fb1-482a-a05a-2aa2adb99a53"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.411629 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xzbv8" event={"ID":"ae971dc1-0fb1-482a-a05a-2aa2adb99a53","Type":"ContainerDied","Data":"549814468a3a5d0cb5b1ee9603d5aff88fde36f23ed4a615d05e523516def7f2"} Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.411676 4830 scope.go:117] "RemoveContainer" containerID="9dfdfaad9bfdf2d13d69b094f175853adcce7f24de017b853eb1ce418bd52a17" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.411762 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xzbv8" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.418217 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4g65n" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.418211 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4g65n" event={"ID":"2cee81fc-684a-4fd2-886a-d899c16a5f8b","Type":"ContainerDied","Data":"4ee7ff51b8a335cda0ebfb8be4718486bc837e5590365cff1a26c277646c0c3d"} Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.418647 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ee7ff51b8a335cda0ebfb8be4718486bc837e5590365cff1a26c277646c0c3d" Dec 03 22:27:02 crc kubenswrapper[4830]: E1203 22:27:02.433415 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-94wsw" podUID="e08a2474-4282-4814-8d14-438d92f1c593" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.435047 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.435077 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.435093 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cee81fc-684a-4fd2-886a-d899c16a5f8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.435119 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlmrw\" (UniqueName: 
\"kubernetes.io/projected/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-kube-api-access-wlmrw\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.435134 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwnqw\" (UniqueName: \"kubernetes.io/projected/2cee81fc-684a-4fd2-886a-d899c16a5f8b-kube-api-access-vwnqw\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.435147 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.435199 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.435222 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae971dc1-0fb1-482a-a05a-2aa2adb99a53-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.486733 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xzbv8"] Dec 03 22:27:02 crc kubenswrapper[4830]: I1203 22:27:02.493984 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xzbv8"] Dec 03 22:27:03 crc kubenswrapper[4830]: E1203 22:27:03.342824 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 03 22:27:03 crc kubenswrapper[4830]: E1203 22:27:03.343037 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wgmn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsU
ser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-z5lqr_openstack(a9793d86-3f26-4443-b740-c2bbcc65f58c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:27:03 crc kubenswrapper[4830]: E1203 22:27:03.345001 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-z5lqr" podUID="a9793d86-3f26-4443-b740-c2bbcc65f58c" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.350407 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" path="/var/lib/kubelet/pods/ae971dc1-0fb1-482a-a05a-2aa2adb99a53/volumes" Dec 03 22:27:03 crc kubenswrapper[4830]: E1203 22:27:03.443378 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-z5lqr" podUID="a9793d86-3f26-4443-b740-c2bbcc65f58c" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.493171 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jrb5f"] Dec 03 22:27:03 crc kubenswrapper[4830]: E1203 22:27:03.493706 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerName="init" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.493723 4830 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerName="init" Dec 03 22:27:03 crc kubenswrapper[4830]: E1203 22:27:03.493742 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerName="dnsmasq-dns" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.493750 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerName="dnsmasq-dns" Dec 03 22:27:03 crc kubenswrapper[4830]: E1203 22:27:03.493786 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cee81fc-684a-4fd2-886a-d899c16a5f8b" containerName="neutron-db-sync" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.493797 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cee81fc-684a-4fd2-886a-d899c16a5f8b" containerName="neutron-db-sync" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.494016 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerName="dnsmasq-dns" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.494050 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cee81fc-684a-4fd2-886a-d899c16a5f8b" containerName="neutron-db-sync" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.495399 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.503623 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jrb5f"] Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.556820 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dllpw\" (UniqueName: \"kubernetes.io/projected/1761df38-e690-4267-bc27-35ee08e90130-kube-api-access-dllpw\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.556893 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.556914 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-config\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.557063 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.557374 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.557441 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-svc\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.581676 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6ffd7655fb-lgwcj"] Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.583307 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.585561 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.585728 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.585840 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.588009 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-m9jgq" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.610517 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ffd7655fb-lgwcj"] Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.658973 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-config\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659011 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659034 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659074 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-httpd-config\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659182 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-combined-ca-bundle\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659219 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-g5p58\" (UniqueName: \"kubernetes.io/projected/f9dee835-249e-46b9-ab11-44b52c0514c4-kube-api-access-g5p58\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659239 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659256 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-svc\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659462 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-ovndb-tls-certs\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659565 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-config\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659599 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dllpw\" 
(UniqueName: \"kubernetes.io/projected/1761df38-e690-4267-bc27-35ee08e90130-kube-api-access-dllpw\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659829 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-config\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.659956 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.660028 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-svc\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.660093 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.660627 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-swift-storage-0\") pod 
\"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.689294 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dllpw\" (UniqueName: \"kubernetes.io/projected/1761df38-e690-4267-bc27-35ee08e90130-kube-api-access-dllpw\") pod \"dnsmasq-dns-55f844cf75-jrb5f\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.761283 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-ovndb-tls-certs\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.761349 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-config\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.761422 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-httpd-config\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.761466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-combined-ca-bundle\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " 
pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.761498 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5p58\" (UniqueName: \"kubernetes.io/projected/f9dee835-249e-46b9-ab11-44b52c0514c4-kube-api-access-g5p58\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.766455 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-config\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.778378 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-combined-ca-bundle\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.778422 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-httpd-config\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.779468 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-ovndb-tls-certs\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.781751 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5p58\" (UniqueName: \"kubernetes.io/projected/f9dee835-249e-46b9-ab11-44b52c0514c4-kube-api-access-g5p58\") pod \"neutron-6ffd7655fb-lgwcj\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") " pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.810297 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:03 crc kubenswrapper[4830]: I1203 22:27:03.904084 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.788195 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57d569655f-t92g7"] Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.790285 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.803841 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.804016 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.835476 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57d569655f-t92g7"] Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.920597 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-internal-tls-certs\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.920668 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-httpd-config\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.920690 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-combined-ca-bundle\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.920761 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-ovndb-tls-certs\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.920778 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-config\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.920807 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlw2f\" (UniqueName: \"kubernetes.io/projected/6d9377f1-1fe5-4451-8224-b5e9e253efa5-kube-api-access-jlw2f\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:05 crc kubenswrapper[4830]: I1203 22:27:05.920851 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-public-tls-certs\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.022862 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-public-tls-certs\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.022947 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-internal-tls-certs\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.022984 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-httpd-config\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.023003 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-combined-ca-bundle\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.023064 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-ovndb-tls-certs\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.023082 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-config\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.023111 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlw2f\" (UniqueName: \"kubernetes.io/projected/6d9377f1-1fe5-4451-8224-b5e9e253efa5-kube-api-access-jlw2f\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.029256 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-httpd-config\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.029792 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-public-tls-certs\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.031104 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-internal-tls-certs\") pod 
\"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.032405 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-config\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.047746 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-ovndb-tls-certs\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.047884 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9377f1-1fe5-4451-8224-b5e9e253efa5-combined-ca-bundle\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.054262 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlw2f\" (UniqueName: \"kubernetes.io/projected/6d9377f1-1fe5-4451-8224-b5e9e253efa5-kube-api-access-jlw2f\") pod \"neutron-57d569655f-t92g7\" (UID: \"6d9377f1-1fe5-4451-8224-b5e9e253efa5\") " pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.118255 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:06 crc kubenswrapper[4830]: I1203 22:27:06.241733 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-xzbv8" podUID="ae971dc1-0fb1-482a-a05a-2aa2adb99a53" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Dec 03 22:27:07 crc kubenswrapper[4830]: I1203 22:27:07.020255 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 22:27:07 crc kubenswrapper[4830]: I1203 22:27:07.733893 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 22:27:09 crc kubenswrapper[4830]: W1203 22:27:09.288747 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa8d18ad_39c6_4264_9a3d_cab20b1ea138.slice/crio-88ee8a3ca1fbdd48d73582bd9e684726e3b6445920e298f7779e39d48b22ec47 WatchSource:0}: Error finding container 88ee8a3ca1fbdd48d73582bd9e684726e3b6445920e298f7779e39d48b22ec47: Status 404 returned error can't find the container with id 88ee8a3ca1fbdd48d73582bd9e684726e3b6445920e298f7779e39d48b22ec47 Dec 03 22:27:09 crc kubenswrapper[4830]: I1203 22:27:09.492086 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fa8d18ad-39c6-4264-9a3d-cab20b1ea138","Type":"ContainerStarted","Data":"88ee8a3ca1fbdd48d73582bd9e684726e3b6445920e298f7779e39d48b22ec47"} Dec 03 22:27:09 crc kubenswrapper[4830]: I1203 22:27:09.696990 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l6rwx"] Dec 03 22:27:09 crc kubenswrapper[4830]: W1203 22:27:09.908394 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10e75b14_a94b_4936_bd47_9da029e04272.slice/crio-ae233b75f6b4dccd99582c5b4d7b2a2ec695715ae60e3957ed9bef9c2ccbf382 WatchSource:0}: Error finding container ae233b75f6b4dccd99582c5b4d7b2a2ec695715ae60e3957ed9bef9c2ccbf382: Status 404 returned error can't find the container with id ae233b75f6b4dccd99582c5b4d7b2a2ec695715ae60e3957ed9bef9c2ccbf382 Dec 03 22:27:09 crc kubenswrapper[4830]: I1203 22:27:09.928039 4830 scope.go:117] "RemoveContainer" containerID="69867339cd6a8969d75a977c4cf3c9124d356d52614a9c93606e04bceeb178fc" Dec 03 22:27:09 crc kubenswrapper[4830]: W1203 22:27:09.939925 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6917fbd_195a_4b58_83e0_27988c69b57d.slice/crio-03fa86efdc4d933799cebd0008b3e44e51d2bcd80b8c1f33b5181802cc9f348b WatchSource:0}: Error finding container 03fa86efdc4d933799cebd0008b3e44e51d2bcd80b8c1f33b5181802cc9f348b: Status 404 returned error can't find the container with id 03fa86efdc4d933799cebd0008b3e44e51d2bcd80b8c1f33b5181802cc9f348b Dec 03 22:27:09 crc kubenswrapper[4830]: E1203 22:27:09.961687 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Dec 03 22:27:09 crc kubenswrapper[4830]: E1203 22:27:09.961726 4830 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Dec 03 22:27:09 crc kubenswrapper[4830]: E1203 22:27:09.961826 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mk946,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-67qfl_openstack(d4f0abb0-964c-42d5-8a2d-2cdf84d049c7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:27:09 crc kubenswrapper[4830]: E1203 22:27:09.963543 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-67qfl" podUID="d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" Dec 03 22:27:10 crc kubenswrapper[4830]: I1203 22:27:10.326057 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:27:10 crc kubenswrapper[4830]: I1203 22:27:10.514568 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ffd7655fb-lgwcj"] Dec 03 22:27:10 crc kubenswrapper[4830]: I1203 22:27:10.523337 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jrb5f"] Dec 03 22:27:10 crc kubenswrapper[4830]: I1203 22:27:10.524689 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l6rwx" event={"ID":"e6917fbd-195a-4b58-83e0-27988c69b57d","Type":"ContainerStarted","Data":"17947ed7c1c22b53408db6b030acdd4dbae1374c362e3ce0d46a0ddd0f87d478"} Dec 03 22:27:10 crc kubenswrapper[4830]: I1203 22:27:10.524730 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l6rwx" event={"ID":"e6917fbd-195a-4b58-83e0-27988c69b57d","Type":"ContainerStarted","Data":"03fa86efdc4d933799cebd0008b3e44e51d2bcd80b8c1f33b5181802cc9f348b"} Dec 03 22:27:10 crc kubenswrapper[4830]: I1203 22:27:10.562086 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"10e75b14-a94b-4936-bd47-9da029e04272","Type":"ContainerStarted","Data":"ae233b75f6b4dccd99582c5b4d7b2a2ec695715ae60e3957ed9bef9c2ccbf382"} Dec 03 22:27:10 crc kubenswrapper[4830]: I1203 22:27:10.578936 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rjqfw" event={"ID":"f130ecd2-4428-4f1e-a386-7084fc52689b","Type":"ContainerStarted","Data":"a4b37c265b5d4fd9e3da2072f0efcfb1376ff8ea294f05fd856a5b2409bc42d3"} Dec 03 22:27:10 crc kubenswrapper[4830]: I1203 22:27:10.586703 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f855a9c-02a9-47af-832c-4a48c1cb26ff","Type":"ContainerStarted","Data":"5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056"} Dec 03 22:27:10 crc kubenswrapper[4830]: E1203 22:27:10.587154 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-67qfl" podUID="d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" Dec 03 22:27:10 crc kubenswrapper[4830]: I1203 22:27:10.595711 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-rjqfw" podStartSLOduration=10.752001908 podStartE2EDuration="43.595694035s" podCreationTimestamp="2025-12-03 22:26:27 +0000 UTC" firstStartedPulling="2025-12-03 22:26:29.270972577 +0000 UTC m=+1278.267433926" lastFinishedPulling="2025-12-03 22:27:02.114664664 +0000 UTC m=+1311.111126053" observedRunningTime="2025-12-03 22:27:10.595239063 +0000 UTC m=+1319.591700412" watchObservedRunningTime="2025-12-03 22:27:10.595694035 +0000 UTC m=+1319.592155384" Dec 03 22:27:10 crc kubenswrapper[4830]: I1203 22:27:10.612206 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-bootstrap-l6rwx" podStartSLOduration=31.612183771 podStartE2EDuration="31.612183771s" podCreationTimestamp="2025-12-03 22:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:10.55742893 +0000 UTC m=+1319.553890279" watchObservedRunningTime="2025-12-03 22:27:10.612183771 +0000 UTC m=+1319.608645130" Dec 03 22:27:10 crc kubenswrapper[4830]: I1203 22:27:10.681554 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57d569655f-t92g7"] Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.624015 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fa8d18ad-39c6-4264-9a3d-cab20b1ea138","Type":"ContainerStarted","Data":"b9111f05eccde3e8cde1eb9e6e9e0610af3ee94be99bcabe9f8d6bd149b26ed6"} Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.635894 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffd7655fb-lgwcj" event={"ID":"f9dee835-249e-46b9-ab11-44b52c0514c4","Type":"ContainerStarted","Data":"d2c625c0c678eff611eb1a6c14b34de1c71c8e5eca7ac7a7b00fa0643798dc98"} Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.635935 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffd7655fb-lgwcj" event={"ID":"f9dee835-249e-46b9-ab11-44b52c0514c4","Type":"ContainerStarted","Data":"81a9c81b1d68fa3cbb7506f86caed908a6ae86e2a155eac519d103ea8d8850bd"} Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.635944 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffd7655fb-lgwcj" event={"ID":"f9dee835-249e-46b9-ab11-44b52c0514c4","Type":"ContainerStarted","Data":"505a350f57ae5f32c7453e484113d0e182005fb1a8b6cb6ed103a8df1507ff7a"} Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.636629 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.643012 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"10e75b14-a94b-4936-bd47-9da029e04272","Type":"ContainerStarted","Data":"9e85cb9626d98c593715b25c00bbb3fd667dd9c4f6ae7a01c800d755aae0bed6"} Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.646675 4830 generic.go:334] "Generic (PLEG): container finished" podID="1761df38-e690-4267-bc27-35ee08e90130" containerID="f805477801e379cd71323c0aac7d81c3f814d15f1dd478512e4d1db7783f4de9" exitCode=0 Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.646731 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" event={"ID":"1761df38-e690-4267-bc27-35ee08e90130","Type":"ContainerDied","Data":"f805477801e379cd71323c0aac7d81c3f814d15f1dd478512e4d1db7783f4de9"} Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.646758 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" event={"ID":"1761df38-e690-4267-bc27-35ee08e90130","Type":"ContainerStarted","Data":"8dbbe8efb506566257b9fa3f9a249746247083b366914afd9190ac56e9eb534a"} Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.673793 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6ffd7655fb-lgwcj" podStartSLOduration=8.673772633 podStartE2EDuration="8.673772633s" podCreationTimestamp="2025-12-03 22:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:11.661734408 +0000 UTC m=+1320.658195757" watchObservedRunningTime="2025-12-03 22:27:11.673772633 +0000 UTC m=+1320.670233982" Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.677758 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d569655f-t92g7" 
event={"ID":"6d9377f1-1fe5-4451-8224-b5e9e253efa5","Type":"ContainerStarted","Data":"9eae2d7d84785bbb1b19460e6077def33b38a9272a3904881461cdeffd12b182"} Dec 03 22:27:11 crc kubenswrapper[4830]: I1203 22:27:11.677790 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d569655f-t92g7" event={"ID":"6d9377f1-1fe5-4451-8224-b5e9e253efa5","Type":"ContainerStarted","Data":"be9254b13b993273ad2050d804046a1d131426b8c102b6464bdec7a1d0639e07"} Dec 03 22:27:12 crc kubenswrapper[4830]: I1203 22:27:12.685460 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f855a9c-02a9-47af-832c-4a48c1cb26ff","Type":"ContainerStarted","Data":"fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922"} Dec 03 22:27:12 crc kubenswrapper[4830]: I1203 22:27:12.687364 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"10e75b14-a94b-4936-bd47-9da029e04272","Type":"ContainerStarted","Data":"96a1c81884ef4bde5cf0b9533ed9dc7f960a61f9d7511b2c807f4859e8ef48c7"} Dec 03 22:27:12 crc kubenswrapper[4830]: I1203 22:27:12.690774 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" event={"ID":"1761df38-e690-4267-bc27-35ee08e90130","Type":"ContainerStarted","Data":"b13bee9e39f1a04463e09a086e7208e4fd3c41e28c44587fc8464d629ac2ade7"} Dec 03 22:27:12 crc kubenswrapper[4830]: I1203 22:27:12.691211 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:12 crc kubenswrapper[4830]: I1203 22:27:12.693136 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57d569655f-t92g7" event={"ID":"6d9377f1-1fe5-4451-8224-b5e9e253efa5","Type":"ContainerStarted","Data":"94649976f64b2c36ac2e836bac718685a18ad242691cc6f54e15931bbd7bfd52"} Dec 03 22:27:12 crc kubenswrapper[4830]: I1203 22:27:12.693560 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:12 crc kubenswrapper[4830]: I1203 22:27:12.696690 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fa8d18ad-39c6-4264-9a3d-cab20b1ea138","Type":"ContainerStarted","Data":"3e7c58e46ceaf242b90a388323d07ea8508b4538a702c15fdeb279488542a7e7"} Dec 03 22:27:12 crc kubenswrapper[4830]: I1203 22:27:12.705932 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=38.705917068 podStartE2EDuration="38.705917068s" podCreationTimestamp="2025-12-03 22:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:12.703334609 +0000 UTC m=+1321.699795958" watchObservedRunningTime="2025-12-03 22:27:12.705917068 +0000 UTC m=+1321.702378417" Dec 03 22:27:12 crc kubenswrapper[4830]: I1203 22:27:12.741041 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57d569655f-t92g7" podStartSLOduration=7.741023789 podStartE2EDuration="7.741023789s" podCreationTimestamp="2025-12-03 22:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:12.734814351 +0000 UTC m=+1321.731275700" watchObservedRunningTime="2025-12-03 22:27:12.741023789 +0000 UTC m=+1321.737485138" Dec 03 22:27:12 crc kubenswrapper[4830]: I1203 22:27:12.772938 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=38.772924002 podStartE2EDuration="38.772924002s" podCreationTimestamp="2025-12-03 22:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:12.75842317 +0000 UTC 
m=+1321.754884519" watchObservedRunningTime="2025-12-03 22:27:12.772924002 +0000 UTC m=+1321.769385361" Dec 03 22:27:12 crc kubenswrapper[4830]: I1203 22:27:12.792890 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" podStartSLOduration=9.792869882 podStartE2EDuration="9.792869882s" podCreationTimestamp="2025-12-03 22:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:12.792689037 +0000 UTC m=+1321.789150396" watchObservedRunningTime="2025-12-03 22:27:12.792869882 +0000 UTC m=+1321.789331221" Dec 03 22:27:13 crc kubenswrapper[4830]: I1203 22:27:13.747594 4830 generic.go:334] "Generic (PLEG): container finished" podID="f130ecd2-4428-4f1e-a386-7084fc52689b" containerID="a4b37c265b5d4fd9e3da2072f0efcfb1376ff8ea294f05fd856a5b2409bc42d3" exitCode=0 Dec 03 22:27:13 crc kubenswrapper[4830]: I1203 22:27:13.747915 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rjqfw" event={"ID":"f130ecd2-4428-4f1e-a386-7084fc52689b","Type":"ContainerDied","Data":"a4b37c265b5d4fd9e3da2072f0efcfb1376ff8ea294f05fd856a5b2409bc42d3"} Dec 03 22:27:13 crc kubenswrapper[4830]: I1203 22:27:13.789009 4830 generic.go:334] "Generic (PLEG): container finished" podID="e6917fbd-195a-4b58-83e0-27988c69b57d" containerID="17947ed7c1c22b53408db6b030acdd4dbae1374c362e3ce0d46a0ddd0f87d478" exitCode=0 Dec 03 22:27:13 crc kubenswrapper[4830]: I1203 22:27:13.790146 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l6rwx" event={"ID":"e6917fbd-195a-4b58-83e0-27988c69b57d","Type":"ContainerDied","Data":"17947ed7c1c22b53408db6b030acdd4dbae1374c362e3ce0d46a0ddd0f87d478"} Dec 03 22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.598915 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 
22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.598975 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.632714 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.665424 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.721769 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.721825 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.758621 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.767637 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.798960 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.799080 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.799433 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 22:27:14 crc kubenswrapper[4830]: I1203 22:27:14.799454 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 
22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.148556 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rjqfw" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.274246 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-config-data\") pod \"f130ecd2-4428-4f1e-a386-7084fc52689b\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.274345 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-scripts\") pod \"f130ecd2-4428-4f1e-a386-7084fc52689b\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.274423 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-combined-ca-bundle\") pod \"f130ecd2-4428-4f1e-a386-7084fc52689b\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.274466 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f130ecd2-4428-4f1e-a386-7084fc52689b-logs\") pod \"f130ecd2-4428-4f1e-a386-7084fc52689b\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.274579 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w66w7\" (UniqueName: \"kubernetes.io/projected/f130ecd2-4428-4f1e-a386-7084fc52689b-kube-api-access-w66w7\") pod \"f130ecd2-4428-4f1e-a386-7084fc52689b\" (UID: \"f130ecd2-4428-4f1e-a386-7084fc52689b\") " Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.275725 4830 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f130ecd2-4428-4f1e-a386-7084fc52689b-logs" (OuterVolumeSpecName: "logs") pod "f130ecd2-4428-4f1e-a386-7084fc52689b" (UID: "f130ecd2-4428-4f1e-a386-7084fc52689b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.283617 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f130ecd2-4428-4f1e-a386-7084fc52689b-kube-api-access-w66w7" (OuterVolumeSpecName: "kube-api-access-w66w7") pod "f130ecd2-4428-4f1e-a386-7084fc52689b" (UID: "f130ecd2-4428-4f1e-a386-7084fc52689b"). InnerVolumeSpecName "kube-api-access-w66w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.283796 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-scripts" (OuterVolumeSpecName: "scripts") pod "f130ecd2-4428-4f1e-a386-7084fc52689b" (UID: "f130ecd2-4428-4f1e-a386-7084fc52689b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.319779 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f130ecd2-4428-4f1e-a386-7084fc52689b" (UID: "f130ecd2-4428-4f1e-a386-7084fc52689b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.321693 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-config-data" (OuterVolumeSpecName: "config-data") pod "f130ecd2-4428-4f1e-a386-7084fc52689b" (UID: "f130ecd2-4428-4f1e-a386-7084fc52689b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.376312 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w66w7\" (UniqueName: \"kubernetes.io/projected/f130ecd2-4428-4f1e-a386-7084fc52689b-kube-api-access-w66w7\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.376537 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.376594 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.376647 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f130ecd2-4428-4f1e-a386-7084fc52689b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.376707 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f130ecd2-4428-4f1e-a386-7084fc52689b-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.831781 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rjqfw" 
event={"ID":"f130ecd2-4428-4f1e-a386-7084fc52689b","Type":"ContainerDied","Data":"99c046c0d394ec6d9256c715e6467935a9f88f14c2b631d4ec280cb1ed7e1370"} Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.832234 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99c046c0d394ec6d9256c715e6467935a9f88f14c2b631d4ec280cb1ed7e1370" Dec 03 22:27:16 crc kubenswrapper[4830]: I1203 22:27:16.831815 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rjqfw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.252391 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6c66b45784-w46xw"] Dec 03 22:27:17 crc kubenswrapper[4830]: E1203 22:27:17.253049 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f130ecd2-4428-4f1e-a386-7084fc52689b" containerName="placement-db-sync" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.253061 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f130ecd2-4428-4f1e-a386-7084fc52689b" containerName="placement-db-sync" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.253264 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f130ecd2-4428-4f1e-a386-7084fc52689b" containerName="placement-db-sync" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.254275 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.256175 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.256345 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.256473 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.256588 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-vcrcr" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.258333 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.265597 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6c66b45784-w46xw"] Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.393234 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-internal-tls-certs\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.393303 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-config-data\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.393321 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-public-tls-certs\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.393338 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b9rb\" (UniqueName: \"kubernetes.io/projected/7956cb25-0b77-4822-b26e-dd512559f30b-kube-api-access-9b9rb\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.393788 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-scripts\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.393855 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-combined-ca-bundle\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.394075 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7956cb25-0b77-4822-b26e-dd512559f30b-logs\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.495400 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-internal-tls-certs\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.495468 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-config-data\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.495487 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-public-tls-certs\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.495518 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b9rb\" (UniqueName: \"kubernetes.io/projected/7956cb25-0b77-4822-b26e-dd512559f30b-kube-api-access-9b9rb\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.495599 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-scripts\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.495617 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-combined-ca-bundle\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.495683 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7956cb25-0b77-4822-b26e-dd512559f30b-logs\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.496627 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7956cb25-0b77-4822-b26e-dd512559f30b-logs\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.500122 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-public-tls-certs\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.500734 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-combined-ca-bundle\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.502147 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-scripts\") pod \"placement-6c66b45784-w46xw\" 
(UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.503491 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-internal-tls-certs\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.506271 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7956cb25-0b77-4822-b26e-dd512559f30b-config-data\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.513181 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b9rb\" (UniqueName: \"kubernetes.io/projected/7956cb25-0b77-4822-b26e-dd512559f30b-kube-api-access-9b9rb\") pod \"placement-6c66b45784-w46xw\" (UID: \"7956cb25-0b77-4822-b26e-dd512559f30b\") " pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:17 crc kubenswrapper[4830]: I1203 22:27:17.608580 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.138855 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.310637 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64ckl\" (UniqueName: \"kubernetes.io/projected/e6917fbd-195a-4b58-83e0-27988c69b57d-kube-api-access-64ckl\") pod \"e6917fbd-195a-4b58-83e0-27988c69b57d\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.310732 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-scripts\") pod \"e6917fbd-195a-4b58-83e0-27988c69b57d\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.310810 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-config-data\") pod \"e6917fbd-195a-4b58-83e0-27988c69b57d\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.310830 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-fernet-keys\") pod \"e6917fbd-195a-4b58-83e0-27988c69b57d\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.310913 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-credential-keys\") pod \"e6917fbd-195a-4b58-83e0-27988c69b57d\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.310928 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-combined-ca-bundle\") pod \"e6917fbd-195a-4b58-83e0-27988c69b57d\" (UID: \"e6917fbd-195a-4b58-83e0-27988c69b57d\") " Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.321937 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6917fbd-195a-4b58-83e0-27988c69b57d-kube-api-access-64ckl" (OuterVolumeSpecName: "kube-api-access-64ckl") pod "e6917fbd-195a-4b58-83e0-27988c69b57d" (UID: "e6917fbd-195a-4b58-83e0-27988c69b57d"). InnerVolumeSpecName "kube-api-access-64ckl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.325075 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e6917fbd-195a-4b58-83e0-27988c69b57d" (UID: "e6917fbd-195a-4b58-83e0-27988c69b57d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.329810 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-scripts" (OuterVolumeSpecName: "scripts") pod "e6917fbd-195a-4b58-83e0-27988c69b57d" (UID: "e6917fbd-195a-4b58-83e0-27988c69b57d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.330931 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e6917fbd-195a-4b58-83e0-27988c69b57d" (UID: "e6917fbd-195a-4b58-83e0-27988c69b57d"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.358566 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-config-data" (OuterVolumeSpecName: "config-data") pod "e6917fbd-195a-4b58-83e0-27988c69b57d" (UID: "e6917fbd-195a-4b58-83e0-27988c69b57d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.397318 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6917fbd-195a-4b58-83e0-27988c69b57d" (UID: "e6917fbd-195a-4b58-83e0-27988c69b57d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.412958 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.412992 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.413013 4830 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.413023 4830 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:18 crc 
kubenswrapper[4830]: I1203 22:27:18.413034 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6917fbd-195a-4b58-83e0-27988c69b57d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.413042 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64ckl\" (UniqueName: \"kubernetes.io/projected/e6917fbd-195a-4b58-83e0-27988c69b57d-kube-api-access-64ckl\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:18 crc kubenswrapper[4830]: W1203 22:27:18.686780 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7956cb25_0b77_4822_b26e_dd512559f30b.slice/crio-579a79b0a3e681dbcbd40edb89b5128117977567183f14df57b4b7bc53115b77 WatchSource:0}: Error finding container 579a79b0a3e681dbcbd40edb89b5128117977567183f14df57b4b7bc53115b77: Status 404 returned error can't find the container with id 579a79b0a3e681dbcbd40edb89b5128117977567183f14df57b4b7bc53115b77 Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.688278 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6c66b45784-w46xw"] Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.812729 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.854816 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l6rwx" event={"ID":"e6917fbd-195a-4b58-83e0-27988c69b57d","Type":"ContainerDied","Data":"03fa86efdc4d933799cebd0008b3e44e51d2bcd80b8c1f33b5181802cc9f348b"} Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.854847 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03fa86efdc4d933799cebd0008b3e44e51d2bcd80b8c1f33b5181802cc9f348b" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 
22:27:18.854902 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-l6rwx" Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.876235 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-94wsw" event={"ID":"e08a2474-4282-4814-8d14-438d92f1c593","Type":"ContainerStarted","Data":"381efed19fedf1a9e8a7dbc42373574a640f624f97ff7a6dc20f1c961a645142"} Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.882999 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c66b45784-w46xw" event={"ID":"7956cb25-0b77-4822-b26e-dd512559f30b","Type":"ContainerStarted","Data":"579a79b0a3e681dbcbd40edb89b5128117977567183f14df57b4b7bc53115b77"} Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.886355 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kfvth"] Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.887521 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" podUID="6e96cd45-9150-437d-bbe7-4f3df3c40d1e" containerName="dnsmasq-dns" containerID="cri-o://172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea" gracePeriod=10 Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.906580 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f855a9c-02a9-47af-832c-4a48c1cb26ff","Type":"ContainerStarted","Data":"96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0"} Dec 03 22:27:18 crc kubenswrapper[4830]: I1203 22:27:18.941464 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-94wsw" podStartSLOduration=2.845515846 podStartE2EDuration="51.941438655s" podCreationTimestamp="2025-12-03 22:26:27 +0000 UTC" firstStartedPulling="2025-12-03 22:26:29.102974709 +0000 UTC m=+1278.099436058" lastFinishedPulling="2025-12-03 
22:27:18.198897508 +0000 UTC m=+1327.195358867" observedRunningTime="2025-12-03 22:27:18.89728392 +0000 UTC m=+1327.893745269" watchObservedRunningTime="2025-12-03 22:27:18.941438655 +0000 UTC m=+1327.937900004" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.480133 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.507647 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-79b896c7bd-gsfgm"] Dec 03 22:27:19 crc kubenswrapper[4830]: E1203 22:27:19.508557 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e96cd45-9150-437d-bbe7-4f3df3c40d1e" containerName="init" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.508578 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e96cd45-9150-437d-bbe7-4f3df3c40d1e" containerName="init" Dec 03 22:27:19 crc kubenswrapper[4830]: E1203 22:27:19.508610 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e96cd45-9150-437d-bbe7-4f3df3c40d1e" containerName="dnsmasq-dns" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.508618 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e96cd45-9150-437d-bbe7-4f3df3c40d1e" containerName="dnsmasq-dns" Dec 03 22:27:19 crc kubenswrapper[4830]: E1203 22:27:19.508646 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6917fbd-195a-4b58-83e0-27988c69b57d" containerName="keystone-bootstrap" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.508653 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6917fbd-195a-4b58-83e0-27988c69b57d" containerName="keystone-bootstrap" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.508977 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6917fbd-195a-4b58-83e0-27988c69b57d" containerName="keystone-bootstrap" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.508999 
4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e96cd45-9150-437d-bbe7-4f3df3c40d1e" containerName="dnsmasq-dns" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.509920 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.521263 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.521491 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.521691 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.521892 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6cfjx" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.522021 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.532448 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79b896c7bd-gsfgm"] Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.532828 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.655362 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-nb\") pod \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.655408 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-config\") pod \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.655435 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq7z8\" (UniqueName: \"kubernetes.io/projected/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-kube-api-access-jq7z8\") pod \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.655628 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-swift-storage-0\") pod \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.655675 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-svc\") pod \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.655711 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-sb\") pod \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\" (UID: \"6e96cd45-9150-437d-bbe7-4f3df3c40d1e\") " Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.655973 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-scripts\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc 
kubenswrapper[4830]: I1203 22:27:19.656029 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-config-data\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.656075 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-internal-tls-certs\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.656104 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-credential-keys\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.656121 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztvqx\" (UniqueName: \"kubernetes.io/projected/ca47aca8-81d5-4c28-b82b-147b7835a87d-kube-api-access-ztvqx\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.656151 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-combined-ca-bundle\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " 
pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.656185 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-public-tls-certs\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.656215 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-fernet-keys\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.692173 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-kube-api-access-jq7z8" (OuterVolumeSpecName: "kube-api-access-jq7z8") pod "6e96cd45-9150-437d-bbe7-4f3df3c40d1e" (UID: "6e96cd45-9150-437d-bbe7-4f3df3c40d1e"). InnerVolumeSpecName "kube-api-access-jq7z8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.759752 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-config-data\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.760164 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-internal-tls-certs\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.760197 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-credential-keys\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.760215 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztvqx\" (UniqueName: \"kubernetes.io/projected/ca47aca8-81d5-4c28-b82b-147b7835a87d-kube-api-access-ztvqx\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.760252 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-combined-ca-bundle\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 
22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.760288 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-public-tls-certs\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.760328 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-fernet-keys\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.760368 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-scripts\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.760429 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq7z8\" (UniqueName: \"kubernetes.io/projected/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-kube-api-access-jq7z8\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.760797 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6e96cd45-9150-437d-bbe7-4f3df3c40d1e" (UID: "6e96cd45-9150-437d-bbe7-4f3df3c40d1e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.764457 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-credential-keys\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.773930 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-internal-tls-certs\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.780139 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-combined-ca-bundle\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.791229 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-scripts\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.801079 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-fernet-keys\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.801661 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-config-data\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.806261 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca47aca8-81d5-4c28-b82b-147b7835a87d-public-tls-certs\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.807160 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e96cd45-9150-437d-bbe7-4f3df3c40d1e" (UID: "6e96cd45-9150-437d-bbe7-4f3df3c40d1e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.808822 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-config" (OuterVolumeSpecName: "config") pod "6e96cd45-9150-437d-bbe7-4f3df3c40d1e" (UID: "6e96cd45-9150-437d-bbe7-4f3df3c40d1e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.816898 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztvqx\" (UniqueName: \"kubernetes.io/projected/ca47aca8-81d5-4c28-b82b-147b7835a87d-kube-api-access-ztvqx\") pod \"keystone-79b896c7bd-gsfgm\" (UID: \"ca47aca8-81d5-4c28-b82b-147b7835a87d\") " pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.863110 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.863303 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.863395 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.864426 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e96cd45-9150-437d-bbe7-4f3df3c40d1e" (UID: "6e96cd45-9150-437d-bbe7-4f3df3c40d1e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.883435 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6e96cd45-9150-437d-bbe7-4f3df3c40d1e" (UID: "6e96cd45-9150-437d-bbe7-4f3df3c40d1e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.924009 4830 generic.go:334] "Generic (PLEG): container finished" podID="6e96cd45-9150-437d-bbe7-4f3df3c40d1e" containerID="172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea" exitCode=0 Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.924118 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.927953 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" event={"ID":"6e96cd45-9150-437d-bbe7-4f3df3c40d1e","Type":"ContainerDied","Data":"172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea"} Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.928108 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kfvth" event={"ID":"6e96cd45-9150-437d-bbe7-4f3df3c40d1e","Type":"ContainerDied","Data":"7e4e9b387ec50ad2866e0ddfa36298c653f31a60ad0e0d29f24db20959185ea7"} Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.928137 4830 scope.go:117] "RemoveContainer" containerID="172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.932313 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c66b45784-w46xw" 
event={"ID":"7956cb25-0b77-4822-b26e-dd512559f30b","Type":"ContainerStarted","Data":"b131e916cd1a982738e7a67e0d5f8d8e493116aa38afd0467813bffeffb6909f"} Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.958050 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.966427 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.966453 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e96cd45-9150-437d-bbe7-4f3df3c40d1e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.978567 4830 scope.go:117] "RemoveContainer" containerID="f24cd8d83d5a4dd35076861b8baa3ac7317c0243bad2b4bb5aee3aa8e94b12d4" Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.989570 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kfvth"] Dec 03 22:27:19 crc kubenswrapper[4830]: I1203 22:27:19.998281 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kfvth"] Dec 03 22:27:20 crc kubenswrapper[4830]: I1203 22:27:20.051395 4830 scope.go:117] "RemoveContainer" containerID="172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea" Dec 03 22:27:20 crc kubenswrapper[4830]: E1203 22:27:20.052831 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea\": container with ID starting with 172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea not found: ID does not exist" 
containerID="172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea" Dec 03 22:27:20 crc kubenswrapper[4830]: I1203 22:27:20.052861 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea"} err="failed to get container status \"172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea\": rpc error: code = NotFound desc = could not find container \"172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea\": container with ID starting with 172f9dcecfd34e34b0cfe73acfc3c77f282b477ec803f715704e92f2871aefea not found: ID does not exist" Dec 03 22:27:20 crc kubenswrapper[4830]: I1203 22:27:20.052886 4830 scope.go:117] "RemoveContainer" containerID="f24cd8d83d5a4dd35076861b8baa3ac7317c0243bad2b4bb5aee3aa8e94b12d4" Dec 03 22:27:20 crc kubenswrapper[4830]: E1203 22:27:20.054872 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f24cd8d83d5a4dd35076861b8baa3ac7317c0243bad2b4bb5aee3aa8e94b12d4\": container with ID starting with f24cd8d83d5a4dd35076861b8baa3ac7317c0243bad2b4bb5aee3aa8e94b12d4 not found: ID does not exist" containerID="f24cd8d83d5a4dd35076861b8baa3ac7317c0243bad2b4bb5aee3aa8e94b12d4" Dec 03 22:27:20 crc kubenswrapper[4830]: I1203 22:27:20.054933 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24cd8d83d5a4dd35076861b8baa3ac7317c0243bad2b4bb5aee3aa8e94b12d4"} err="failed to get container status \"f24cd8d83d5a4dd35076861b8baa3ac7317c0243bad2b4bb5aee3aa8e94b12d4\": rpc error: code = NotFound desc = could not find container \"f24cd8d83d5a4dd35076861b8baa3ac7317c0243bad2b4bb5aee3aa8e94b12d4\": container with ID starting with f24cd8d83d5a4dd35076861b8baa3ac7317c0243bad2b4bb5aee3aa8e94b12d4 not found: ID does not exist" Dec 03 22:27:20 crc kubenswrapper[4830]: I1203 22:27:20.538915 4830 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/keystone-79b896c7bd-gsfgm"] Dec 03 22:27:20 crc kubenswrapper[4830]: W1203 22:27:20.564760 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca47aca8_81d5_4c28_b82b_147b7835a87d.slice/crio-68111b50576eb1f294b606d2067fb404bc4db858e732d70bec61b289a4da7acd WatchSource:0}: Error finding container 68111b50576eb1f294b606d2067fb404bc4db858e732d70bec61b289a4da7acd: Status 404 returned error can't find the container with id 68111b50576eb1f294b606d2067fb404bc4db858e732d70bec61b289a4da7acd Dec 03 22:27:20 crc kubenswrapper[4830]: I1203 22:27:20.952048 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c66b45784-w46xw" event={"ID":"7956cb25-0b77-4822-b26e-dd512559f30b","Type":"ContainerStarted","Data":"94f710a4c2ed370dd6f9bda9bdb32116b3ba7018883fca672131e630211f15ce"} Dec 03 22:27:20 crc kubenswrapper[4830]: I1203 22:27:20.953250 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79b896c7bd-gsfgm" event={"ID":"ca47aca8-81d5-4c28-b82b-147b7835a87d","Type":"ContainerStarted","Data":"68111b50576eb1f294b606d2067fb404bc4db858e732d70bec61b289a4da7acd"} Dec 03 22:27:21 crc kubenswrapper[4830]: I1203 22:27:21.348395 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e96cd45-9150-437d-bbe7-4f3df3c40d1e" path="/var/lib/kubelet/pods/6e96cd45-9150-437d-bbe7-4f3df3c40d1e/volumes" Dec 03 22:27:25 crc kubenswrapper[4830]: I1203 22:27:25.007568 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-z5lqr" event={"ID":"a9793d86-3f26-4443-b740-c2bbcc65f58c","Type":"ContainerStarted","Data":"69904b4d6f7cc0c217f507c0d5363fac25eb8fd1c6609b4d4e245c400e6ca3e7"} Dec 03 22:27:26 crc kubenswrapper[4830]: I1203 22:27:26.018860 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79b896c7bd-gsfgm" 
event={"ID":"ca47aca8-81d5-4c28-b82b-147b7835a87d","Type":"ContainerStarted","Data":"7dffeeecd5a7db98ac68a4fbb135463d81b0c30f23a1a0514e79a9a867ba5360"} Dec 03 22:27:26 crc kubenswrapper[4830]: I1203 22:27:26.019129 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:26 crc kubenswrapper[4830]: I1203 22:27:26.019143 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:26 crc kubenswrapper[4830]: I1203 22:27:26.060378 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6c66b45784-w46xw" podStartSLOduration=9.060359931 podStartE2EDuration="9.060359931s" podCreationTimestamp="2025-12-03 22:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:26.049417875 +0000 UTC m=+1335.045879214" watchObservedRunningTime="2025-12-03 22:27:26.060359931 +0000 UTC m=+1335.056821280" Dec 03 22:27:27 crc kubenswrapper[4830]: I1203 22:27:27.039020 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-67qfl" event={"ID":"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7","Type":"ContainerStarted","Data":"3cc8cdda701438d35564c3e5d745b988294c619ef5feaa267b94019bbc58bc23"} Dec 03 22:27:27 crc kubenswrapper[4830]: I1203 22:27:27.039460 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:27 crc kubenswrapper[4830]: I1203 22:27:27.064634 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-z5lqr" podStartSLOduration=9.927448728 podStartE2EDuration="1m0.064617011s" podCreationTimestamp="2025-12-03 22:26:27 +0000 UTC" firstStartedPulling="2025-12-03 22:26:28.622875573 +0000 UTC m=+1277.619336922" lastFinishedPulling="2025-12-03 22:27:18.760043836 +0000 UTC 
m=+1327.756505205" observedRunningTime="2025-12-03 22:27:26.073122826 +0000 UTC m=+1335.069584185" watchObservedRunningTime="2025-12-03 22:27:27.064617011 +0000 UTC m=+1336.061078360" Dec 03 22:27:27 crc kubenswrapper[4830]: I1203 22:27:27.066691 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-79b896c7bd-gsfgm" podStartSLOduration=8.066681277 podStartE2EDuration="8.066681277s" podCreationTimestamp="2025-12-03 22:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:27.063196773 +0000 UTC m=+1336.059658132" watchObservedRunningTime="2025-12-03 22:27:27.066681277 +0000 UTC m=+1336.063142626" Dec 03 22:27:27 crc kubenswrapper[4830]: I1203 22:27:27.089196 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-67qfl" podStartSLOduration=2.198758427 podStartE2EDuration="1m0.089178766s" podCreationTimestamp="2025-12-03 22:26:27 +0000 UTC" firstStartedPulling="2025-12-03 22:26:28.679102272 +0000 UTC m=+1277.675563621" lastFinishedPulling="2025-12-03 22:27:26.569522611 +0000 UTC m=+1335.565983960" observedRunningTime="2025-12-03 22:27:27.087633274 +0000 UTC m=+1336.084094613" watchObservedRunningTime="2025-12-03 22:27:27.089178766 +0000 UTC m=+1336.085640115" Dec 03 22:27:27 crc kubenswrapper[4830]: I1203 22:27:27.759924 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:27 crc kubenswrapper[4830]: I1203 22:27:27.762239 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6c66b45784-w46xw" Dec 03 22:27:31 crc kubenswrapper[4830]: I1203 22:27:31.081611 4830 generic.go:334] "Generic (PLEG): container finished" podID="e08a2474-4282-4814-8d14-438d92f1c593" containerID="381efed19fedf1a9e8a7dbc42373574a640f624f97ff7a6dc20f1c961a645142" exitCode=0 Dec 03 
22:27:31 crc kubenswrapper[4830]: I1203 22:27:31.081857 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-94wsw" event={"ID":"e08a2474-4282-4814-8d14-438d92f1c593","Type":"ContainerDied","Data":"381efed19fedf1a9e8a7dbc42373574a640f624f97ff7a6dc20f1c961a645142"} Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.094109 4830 generic.go:334] "Generic (PLEG): container finished" podID="a9793d86-3f26-4443-b740-c2bbcc65f58c" containerID="69904b4d6f7cc0c217f507c0d5363fac25eb8fd1c6609b4d4e245c400e6ca3e7" exitCode=0 Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.094200 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-z5lqr" event={"ID":"a9793d86-3f26-4443-b740-c2bbcc65f58c","Type":"ContainerDied","Data":"69904b4d6f7cc0c217f507c0d5363fac25eb8fd1c6609b4d4e245c400e6ca3e7"} Dec 03 22:27:32 crc kubenswrapper[4830]: E1203 22:27:32.416908 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4f0abb0_964c_42d5_8a2d_2cdf84d049c7.slice/crio-conmon-3cc8cdda701438d35564c3e5d745b988294c619ef5feaa267b94019bbc58bc23.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.526687 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-94wsw" Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.653719 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-db-sync-config-data\") pod \"e08a2474-4282-4814-8d14-438d92f1c593\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.653807 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-combined-ca-bundle\") pod \"e08a2474-4282-4814-8d14-438d92f1c593\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.653990 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgs9l\" (UniqueName: \"kubernetes.io/projected/e08a2474-4282-4814-8d14-438d92f1c593-kube-api-access-hgs9l\") pod \"e08a2474-4282-4814-8d14-438d92f1c593\" (UID: \"e08a2474-4282-4814-8d14-438d92f1c593\") " Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.660225 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e08a2474-4282-4814-8d14-438d92f1c593" (UID: "e08a2474-4282-4814-8d14-438d92f1c593"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.662992 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08a2474-4282-4814-8d14-438d92f1c593-kube-api-access-hgs9l" (OuterVolumeSpecName: "kube-api-access-hgs9l") pod "e08a2474-4282-4814-8d14-438d92f1c593" (UID: "e08a2474-4282-4814-8d14-438d92f1c593"). 
InnerVolumeSpecName "kube-api-access-hgs9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.699785 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e08a2474-4282-4814-8d14-438d92f1c593" (UID: "e08a2474-4282-4814-8d14-438d92f1c593"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.756881 4830 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.756914 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08a2474-4282-4814-8d14-438d92f1c593-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:32 crc kubenswrapper[4830]: I1203 22:27:32.756927 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgs9l\" (UniqueName: \"kubernetes.io/projected/e08a2474-4282-4814-8d14-438d92f1c593-kube-api-access-hgs9l\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.112384 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f855a9c-02a9-47af-832c-4a48c1cb26ff","Type":"ContainerStarted","Data":"1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1"} Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.112638 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="ceilometer-central-agent" 
containerID="cri-o://5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056" gracePeriod=30 Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.112785 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.112820 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="proxy-httpd" containerID="cri-o://1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1" gracePeriod=30 Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.112877 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="sg-core" containerID="cri-o://96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0" gracePeriod=30 Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.112918 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="ceilometer-notification-agent" containerID="cri-o://fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922" gracePeriod=30 Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.126269 4830 generic.go:334] "Generic (PLEG): container finished" podID="d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" containerID="3cc8cdda701438d35564c3e5d745b988294c619ef5feaa267b94019bbc58bc23" exitCode=0 Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.126373 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-67qfl" event={"ID":"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7","Type":"ContainerDied","Data":"3cc8cdda701438d35564c3e5d745b988294c619ef5feaa267b94019bbc58bc23"} Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.156306 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-94wsw" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.155950 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-94wsw" event={"ID":"e08a2474-4282-4814-8d14-438d92f1c593","Type":"ContainerDied","Data":"f11e1a27f24aa8e4e232388dd01b2a20dcb5f1ac13283e7b834d653210fb5206"} Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.161488 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f11e1a27f24aa8e4e232388dd01b2a20dcb5f1ac13283e7b834d653210fb5206" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.189839 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.82813775 podStartE2EDuration="1m6.189820262s" podCreationTimestamp="2025-12-03 22:26:27 +0000 UTC" firstStartedPulling="2025-12-03 22:26:28.659720965 +0000 UTC m=+1277.656182314" lastFinishedPulling="2025-12-03 22:27:32.021403467 +0000 UTC m=+1341.017864826" observedRunningTime="2025-12-03 22:27:33.15577851 +0000 UTC m=+1342.152239859" watchObservedRunningTime="2025-12-03 22:27:33.189820262 +0000 UTC m=+1342.186281621" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.524078 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7c48ccfdc-rn6l7"] Dec 03 22:27:33 crc kubenswrapper[4830]: E1203 22:27:33.524706 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08a2474-4282-4814-8d14-438d92f1c593" containerName="barbican-db-sync" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.524718 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08a2474-4282-4814-8d14-438d92f1c593" containerName="barbican-db-sync" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.524905 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08a2474-4282-4814-8d14-438d92f1c593" containerName="barbican-db-sync" Dec 03 22:27:33 crc 
kubenswrapper[4830]: I1203 22:27:33.525906 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.534865 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.535095 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.554155 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n2pqj" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.577510 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8f8d56fd8-kd4r4"] Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.579096 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.584893 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.607893 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c48ccfdc-rn6l7"] Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.661690 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8f8d56fd8-kd4r4"] Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.678147 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f622ae76-ce43-4025-90eb-e609fbe2a004-config-data-custom\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " 
pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.678222 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f622ae76-ce43-4025-90eb-e609fbe2a004-config-data\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.678288 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f622ae76-ce43-4025-90eb-e609fbe2a004-logs\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.678308 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhfkk\" (UniqueName: \"kubernetes.io/projected/f622ae76-ce43-4025-90eb-e609fbe2a004-kube-api-access-dhfkk\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.678355 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f622ae76-ce43-4025-90eb-e609fbe2a004-combined-ca-bundle\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.698573 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xqzjk"] Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.700432 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.764787 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xqzjk"] Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.779007 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.779832 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f622ae76-ce43-4025-90eb-e609fbe2a004-config-data\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.779884 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56894369-6e4d-451e-b510-60c1cad4b111-combined-ca-bundle\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.779908 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56894369-6e4d-451e-b510-60c1cad4b111-config-data-custom\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.779949 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56894369-6e4d-451e-b510-60c1cad4b111-logs\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: 
\"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.779972 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f622ae76-ce43-4025-90eb-e609fbe2a004-logs\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.779992 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhfkk\" (UniqueName: \"kubernetes.io/projected/f622ae76-ce43-4025-90eb-e609fbe2a004-kube-api-access-dhfkk\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.780042 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f622ae76-ce43-4025-90eb-e609fbe2a004-combined-ca-bundle\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.780071 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56894369-6e4d-451e-b510-60c1cad4b111-config-data\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.780106 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5pc9\" (UniqueName: 
\"kubernetes.io/projected/56894369-6e4d-451e-b510-60c1cad4b111-kube-api-access-g5pc9\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.780124 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f622ae76-ce43-4025-90eb-e609fbe2a004-config-data-custom\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.781985 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f622ae76-ce43-4025-90eb-e609fbe2a004-logs\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.788369 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f622ae76-ce43-4025-90eb-e609fbe2a004-combined-ca-bundle\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.789452 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f622ae76-ce43-4025-90eb-e609fbe2a004-config-data\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.795200 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f622ae76-ce43-4025-90eb-e609fbe2a004-config-data-custom\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.830766 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhfkk\" (UniqueName: \"kubernetes.io/projected/f622ae76-ce43-4025-90eb-e609fbe2a004-kube-api-access-dhfkk\") pod \"barbican-worker-7c48ccfdc-rn6l7\" (UID: \"f622ae76-ce43-4025-90eb-e609fbe2a004\") " pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.875084 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-765746756b-b9w2l"] Dec 03 22:27:33 crc kubenswrapper[4830]: E1203 22:27:33.875457 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9793d86-3f26-4443-b740-c2bbcc65f58c" containerName="cinder-db-sync" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.875469 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9793d86-3f26-4443-b740-c2bbcc65f58c" containerName="cinder-db-sync" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.875677 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9793d86-3f26-4443-b740-c2bbcc65f58c" containerName="cinder-db-sync" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.876652 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.883042 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.886609 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9793d86-3f26-4443-b740-c2bbcc65f58c-etc-machine-id\") pod \"a9793d86-3f26-4443-b740-c2bbcc65f58c\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.886677 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-config-data\") pod \"a9793d86-3f26-4443-b740-c2bbcc65f58c\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.886754 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-scripts\") pod \"a9793d86-3f26-4443-b740-c2bbcc65f58c\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.886797 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-combined-ca-bundle\") pod \"a9793d86-3f26-4443-b740-c2bbcc65f58c\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.886896 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-db-sync-config-data\") pod \"a9793d86-3f26-4443-b740-c2bbcc65f58c\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " Dec 
03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.886997 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wgmn\" (UniqueName: \"kubernetes.io/projected/a9793d86-3f26-4443-b740-c2bbcc65f58c-kube-api-access-7wgmn\") pod \"a9793d86-3f26-4443-b740-c2bbcc65f58c\" (UID: \"a9793d86-3f26-4443-b740-c2bbcc65f58c\") " Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.887213 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56894369-6e4d-451e-b510-60c1cad4b111-combined-ca-bundle\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.887236 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56894369-6e4d-451e-b510-60c1cad4b111-config-data-custom\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.887279 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56894369-6e4d-451e-b510-60c1cad4b111-logs\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.887319 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " 
pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.887344 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-svc\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.887362 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.887377 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.887411 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-config\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.887438 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56894369-6e4d-451e-b510-60c1cad4b111-config-data\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: 
\"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.887474 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5pc9\" (UniqueName: \"kubernetes.io/projected/56894369-6e4d-451e-b510-60c1cad4b111-kube-api-access-g5pc9\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.887499 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmlnl\" (UniqueName: \"kubernetes.io/projected/11e96c53-2efc-403c-b5c4-9dbc38104dc8-kube-api-access-mmlnl\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.895801 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-scripts" (OuterVolumeSpecName: "scripts") pod "a9793d86-3f26-4443-b740-c2bbcc65f58c" (UID: "a9793d86-3f26-4443-b740-c2bbcc65f58c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.897772 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9793d86-3f26-4443-b740-c2bbcc65f58c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a9793d86-3f26-4443-b740-c2bbcc65f58c" (UID: "a9793d86-3f26-4443-b740-c2bbcc65f58c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.898051 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-765746756b-b9w2l"] Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.898884 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56894369-6e4d-451e-b510-60c1cad4b111-logs\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.909740 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56894369-6e4d-451e-b510-60c1cad4b111-config-data\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.910395 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9793d86-3f26-4443-b740-c2bbcc65f58c-kube-api-access-7wgmn" (OuterVolumeSpecName: "kube-api-access-7wgmn") pod "a9793d86-3f26-4443-b740-c2bbcc65f58c" (UID: "a9793d86-3f26-4443-b740-c2bbcc65f58c"). InnerVolumeSpecName "kube-api-access-7wgmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.910727 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56894369-6e4d-451e-b510-60c1cad4b111-config-data-custom\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.924137 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56894369-6e4d-451e-b510-60c1cad4b111-combined-ca-bundle\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.924824 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5pc9\" (UniqueName: \"kubernetes.io/projected/56894369-6e4d-451e-b510-60c1cad4b111-kube-api-access-g5pc9\") pod \"barbican-keystone-listener-8f8d56fd8-kd4r4\" (UID: \"56894369-6e4d-451e-b510-60c1cad4b111\") " pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.929054 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9793d86-3f26-4443-b740-c2bbcc65f58c" (UID: "a9793d86-3f26-4443-b740-c2bbcc65f58c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.929891 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a9793d86-3f26-4443-b740-c2bbcc65f58c" (UID: "a9793d86-3f26-4443-b740-c2bbcc65f58c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.936794 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6ffd7655fb-lgwcj" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.940750 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c48ccfdc-rn6l7" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.988965 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmlnl\" (UniqueName: \"kubernetes.io/projected/11e96c53-2efc-403c-b5c4-9dbc38104dc8-kube-api-access-mmlnl\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.989060 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-combined-ca-bundle\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.989107 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8sr9\" (UniqueName: \"kubernetes.io/projected/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-kube-api-access-r8sr9\") pod 
\"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.989176 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data-custom\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.989198 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-logs\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.989230 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.989251 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-svc\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.989269 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: 
\"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.989290 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.989989 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-config\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.990046 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.990103 4830 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9793d86-3f26-4443-b740-c2bbcc65f58c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.990118 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.990127 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.990139 4830 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.990148 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wgmn\" (UniqueName: \"kubernetes.io/projected/a9793d86-3f26-4443-b740-c2bbcc65f58c-kube-api-access-7wgmn\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:33 crc kubenswrapper[4830]: I1203 22:27:33.990361 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:33.990480 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-svc\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:33.990997 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:33.991298 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-config\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:33.991443 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.005386 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmlnl\" (UniqueName: \"kubernetes.io/projected/11e96c53-2efc-403c-b5c4-9dbc38104dc8-kube-api-access-mmlnl\") pod \"dnsmasq-dns-85ff748b95-xqzjk\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.005774 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-config-data" (OuterVolumeSpecName: "config-data") pod "a9793d86-3f26-4443-b740-c2bbcc65f58c" (UID: "a9793d86-3f26-4443-b740-c2bbcc65f58c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.060016 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.093304 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8sr9\" (UniqueName: \"kubernetes.io/projected/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-kube-api-access-r8sr9\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.093414 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data-custom\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.093450 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-logs\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.093567 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.093718 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-combined-ca-bundle\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " 
pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.093805 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9793d86-3f26-4443-b740-c2bbcc65f58c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.094231 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-logs\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.100913 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.101871 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data-custom\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.108806 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.110975 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-combined-ca-bundle\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " 
pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.111133 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8sr9\" (UniqueName: \"kubernetes.io/projected/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-kube-api-access-r8sr9\") pod \"barbican-api-765746756b-b9w2l\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.223163 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.241167 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-z5lqr" event={"ID":"a9793d86-3f26-4443-b740-c2bbcc65f58c","Type":"ContainerDied","Data":"2a17ed687a3a72ba98e0d26dd5d66778270bf8ee1bd004597bc7266e8f52cab6"} Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.241207 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a17ed687a3a72ba98e0d26dd5d66778270bf8ee1bd004597bc7266e8f52cab6" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.241269 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-z5lqr" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.259894 4830 generic.go:334] "Generic (PLEG): container finished" podID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerID="1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1" exitCode=0 Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.259930 4830 generic.go:334] "Generic (PLEG): container finished" podID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerID="96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0" exitCode=2 Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.259939 4830 generic.go:334] "Generic (PLEG): container finished" podID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerID="5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056" exitCode=0 Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.260163 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f855a9c-02a9-47af-832c-4a48c1cb26ff","Type":"ContainerDied","Data":"1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1"} Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.260192 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f855a9c-02a9-47af-832c-4a48c1cb26ff","Type":"ContainerDied","Data":"96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0"} Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.260217 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f855a9c-02a9-47af-832c-4a48c1cb26ff","Type":"ContainerDied","Data":"5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056"} Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.389586 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.391634 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.395933 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lllk2" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.396304 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.396422 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.396552 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.417561 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.448857 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xqzjk"] Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.494373 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-slgkf"] Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.508203 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.515726 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.515765 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlt7m\" (UniqueName: \"kubernetes.io/projected/24663030-2270-470f-875a-fcf06af8047d-kube-api-access-zlt7m\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.515816 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.515890 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24663030-2270-470f-875a-fcf06af8047d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.515971 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " 
pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.515991 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-scripts\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.537732 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-slgkf"] Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.589352 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c48ccfdc-rn6l7"] Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.621837 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwjpw\" (UniqueName: \"kubernetes.io/projected/dfbf712d-b734-4678-b329-b0a5e6ef0466-kube-api-access-wwjpw\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.621895 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-config\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.621939 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: 
I1203 22:27:34.621965 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlt7m\" (UniqueName: \"kubernetes.io/projected/24663030-2270-470f-875a-fcf06af8047d-kube-api-access-zlt7m\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.621987 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.622024 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.622072 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.622164 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.622194 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24663030-2270-470f-875a-fcf06af8047d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.622241 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.622294 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.622319 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-scripts\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.624015 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24663030-2270-470f-875a-fcf06af8047d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.628456 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.629360 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.629617 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.631315 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.634505 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.640200 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.644035 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlt7m\" (UniqueName: \"kubernetes.io/projected/24663030-2270-470f-875a-fcf06af8047d-kube-api-access-zlt7m\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.648934 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.658185 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.736576 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwjpw\" (UniqueName: \"kubernetes.io/projected/dfbf712d-b734-4678-b329-b0a5e6ef0466-kube-api-access-wwjpw\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.736652 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-config\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.736703 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.736736 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.736873 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.736943 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.738636 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.750269 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-config\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.750677 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.750822 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.750966 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.764633 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.788321 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwjpw\" (UniqueName: \"kubernetes.io/projected/dfbf712d-b734-4678-b329-b0a5e6ef0466-kube-api-access-wwjpw\") pod \"dnsmasq-dns-5c9776ccc5-slgkf\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.842485 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e13f2fa-e70c-48c0-bf3a-03f436d88777-logs\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.842599 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-scripts\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.842620 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.842653 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6wvm\" (UniqueName: \"kubernetes.io/projected/7e13f2fa-e70c-48c0-bf3a-03f436d88777-kube-api-access-s6wvm\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.842676 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.842714 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e13f2fa-e70c-48c0-bf3a-03f436d88777-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.842739 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.878849 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.922685 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.944461 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.944512 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e13f2fa-e70c-48c0-bf3a-03f436d88777-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.944551 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.944654 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e13f2fa-e70c-48c0-bf3a-03f436d88777-logs\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.944684 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-scripts\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " 
pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.944700 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.944726 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6wvm\" (UniqueName: \"kubernetes.io/projected/7e13f2fa-e70c-48c0-bf3a-03f436d88777-kube-api-access-s6wvm\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.949800 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.949844 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e13f2fa-e70c-48c0-bf3a-03f436d88777-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.955797 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.956026 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7e13f2fa-e70c-48c0-bf3a-03f436d88777-logs\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.967200 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.967408 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-scripts\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.974054 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6wvm\" (UniqueName: \"kubernetes.io/projected/7e13f2fa-e70c-48c0-bf3a-03f436d88777-kube-api-access-s6wvm\") pod \"cinder-api-0\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") " pod="openstack/cinder-api-0" Dec 03 22:27:34 crc kubenswrapper[4830]: I1203 22:27:34.997132 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8f8d56fd8-kd4r4"] Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.049586 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-certs\") pod \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.049634 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-scripts\") pod 
\"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.049733 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-combined-ca-bundle\") pod \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.049847 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk946\" (UniqueName: \"kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-kube-api-access-mk946\") pod \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.049903 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-config-data\") pod \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\" (UID: \"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7\") " Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.059775 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-certs" (OuterVolumeSpecName: "certs") pod "d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" (UID: "d4f0abb0-964c-42d5-8a2d-2cdf84d049c7"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.067600 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-kube-api-access-mk946" (OuterVolumeSpecName: "kube-api-access-mk946") pod "d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" (UID: "d4f0abb0-964c-42d5-8a2d-2cdf84d049c7"). InnerVolumeSpecName "kube-api-access-mk946". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.079906 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-scripts" (OuterVolumeSpecName: "scripts") pod "d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" (UID: "d4f0abb0-964c-42d5-8a2d-2cdf84d049c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.090229 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-config-data" (OuterVolumeSpecName: "config-data") pod "d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" (UID: "d4f0abb0-964c-42d5-8a2d-2cdf84d049c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.092943 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" (UID: "d4f0abb0-964c-42d5-8a2d-2cdf84d049c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.102967 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.152021 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk946\" (UniqueName: \"kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-kube-api-access-mk946\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.152068 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.152080 4830 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.152090 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.152100 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.283080 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xqzjk"] Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.303215 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" event={"ID":"56894369-6e4d-451e-b510-60c1cad4b111","Type":"ContainerStarted","Data":"e6dd5c3df7f4178715a33a87be232cdbb3e60e661146c9d3410bebf750f1b83e"} Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.313174 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-7c48ccfdc-rn6l7" event={"ID":"f622ae76-ce43-4025-90eb-e609fbe2a004","Type":"ContainerStarted","Data":"e26bac9012f79ff6731d7663b2c13f10f01df4dae5b74e05b39779e111702e22"} Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.365795 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-67qfl" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.385637 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-67qfl" event={"ID":"d4f0abb0-964c-42d5-8a2d-2cdf84d049c7","Type":"ContainerDied","Data":"26dc414c40c3bfdb2e4e3f56ed34bf980fc69fc60e1196548e5649b4f4618490"} Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.385674 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26dc414c40c3bfdb2e4e3f56ed34bf980fc69fc60e1196548e5649b4f4618490" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.385691 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-hvnzw"] Dec 03 22:27:35 crc kubenswrapper[4830]: E1203 22:27:35.386059 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" containerName="cloudkitty-db-sync" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.386073 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" containerName="cloudkitty-db-sync" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.387197 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" containerName="cloudkitty-db-sync" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.387942 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.388365 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-hvnzw"] Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.392392 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.392931 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.393389 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-pzjnl" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.393603 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.393776 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.451953 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-765746756b-b9w2l"] Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.573926 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-combined-ca-bundle\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.574200 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-certs\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " 
pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.574265 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbzns\" (UniqueName: \"kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-kube-api-access-zbzns\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.574464 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-config-data\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.574564 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-scripts\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.626752 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.636217 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-slgkf"] Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.676451 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-config-data\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: 
I1203 22:27:35.676506 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-scripts\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.676608 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-combined-ca-bundle\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.676632 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-certs\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.676674 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbzns\" (UniqueName: \"kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-kube-api-access-zbzns\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.686161 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-scripts\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.686557 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-config-data\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.686995 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-certs\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.696221 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbzns\" (UniqueName: \"kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-kube-api-access-zbzns\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:35 crc kubenswrapper[4830]: I1203 22:27:35.702124 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-combined-ca-bundle\") pod \"cloudkitty-storageinit-hvnzw\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.065318 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.143830 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57d569655f-t92g7" Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.164868 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.261403 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6ffd7655fb-lgwcj"] Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.261621 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6ffd7655fb-lgwcj" podUID="f9dee835-249e-46b9-ab11-44b52c0514c4" containerName="neutron-api" containerID="cri-o://81a9c81b1d68fa3cbb7506f86caed908a6ae86e2a155eac519d103ea8d8850bd" gracePeriod=30 Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.267724 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6ffd7655fb-lgwcj" podUID="f9dee835-249e-46b9-ab11-44b52c0514c4" containerName="neutron-httpd" containerID="cri-o://d2c625c0c678eff611eb1a6c14b34de1c71c8e5eca7ac7a7b00fa0643798dc98" gracePeriod=30 Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.411013 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" event={"ID":"dfbf712d-b734-4678-b329-b0a5e6ef0466","Type":"ContainerStarted","Data":"47fb9fb5fe0f8ce64e252f41aacf6212e61f8aeae26b7842ec3f050f464eb530"} Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.418780 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e13f2fa-e70c-48c0-bf3a-03f436d88777","Type":"ContainerStarted","Data":"4bd8d9b39277504fb4e7a2ba6d8ae31225fa9bf24f7f22f52b93d8b676bcaf89"} Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.440574 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" event={"ID":"11e96c53-2efc-403c-b5c4-9dbc38104dc8","Type":"ContainerStarted","Data":"95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4"} Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.440616 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" event={"ID":"11e96c53-2efc-403c-b5c4-9dbc38104dc8","Type":"ContainerStarted","Data":"2583eee12836f1a86aa5cf0f5e1cb8a111b62ae3e030f7b1edfbfd8e2b5bf76c"} Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.455702 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24663030-2270-470f-875a-fcf06af8047d","Type":"ContainerStarted","Data":"502cbf317e47edae675fc6102384397ea2ba6836692e23f8925f15492b1c6c33"} Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.465831 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-765746756b-b9w2l" event={"ID":"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4","Type":"ContainerStarted","Data":"c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d"} Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.465872 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-765746756b-b9w2l" event={"ID":"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4","Type":"ContainerStarted","Data":"979923b565cf271822f3c1fb2af250c661d73d02b441f0d5917add4f9eac457b"} Dec 03 22:27:36 crc kubenswrapper[4830]: I1203 22:27:36.860262 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-hvnzw"] Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.424284 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.530710 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.538044 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-hvnzw" event={"ID":"5a4d391d-0a89-417b-b548-b4754e4dcc99","Type":"ContainerStarted","Data":"d9d5f16469b8123bc6499d5077a96e2a9678fe79a93885eead62c47e897619f7"} Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.538079 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-hvnzw" event={"ID":"5a4d391d-0a89-417b-b548-b4754e4dcc99","Type":"ContainerStarted","Data":"118f6c6d207733b091326073a1e19d627194294647c79f462a367de6363d4935"} Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.547665 4830 generic.go:334] "Generic (PLEG): container finished" podID="11e96c53-2efc-403c-b5c4-9dbc38104dc8" containerID="95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4" exitCode=0 Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.547729 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" event={"ID":"11e96c53-2efc-403c-b5c4-9dbc38104dc8","Type":"ContainerDied","Data":"95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4"} Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.547753 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" event={"ID":"11e96c53-2efc-403c-b5c4-9dbc38104dc8","Type":"ContainerDied","Data":"2583eee12836f1a86aa5cf0f5e1cb8a111b62ae3e030f7b1edfbfd8e2b5bf76c"} Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.547769 4830 scope.go:117] "RemoveContainer" containerID="95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.547883 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xqzjk" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.563733 4830 generic.go:334] "Generic (PLEG): container finished" podID="f9dee835-249e-46b9-ab11-44b52c0514c4" containerID="d2c625c0c678eff611eb1a6c14b34de1c71c8e5eca7ac7a7b00fa0643798dc98" exitCode=0 Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.563795 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffd7655fb-lgwcj" event={"ID":"f9dee835-249e-46b9-ab11-44b52c0514c4","Type":"ContainerDied","Data":"d2c625c0c678eff611eb1a6c14b34de1c71c8e5eca7ac7a7b00fa0643798dc98"} Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.584831 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-nb\") pod \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.584887 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-config\") pod \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.584997 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmlnl\" (UniqueName: \"kubernetes.io/projected/11e96c53-2efc-403c-b5c4-9dbc38104dc8-kube-api-access-mmlnl\") pod \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.585052 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-sb\") pod \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\" 
(UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.585113 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-swift-storage-0\") pod \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.585134 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-svc\") pod \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.591769 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e96c53-2efc-403c-b5c4-9dbc38104dc8-kube-api-access-mmlnl" (OuterVolumeSpecName: "kube-api-access-mmlnl") pod "11e96c53-2efc-403c-b5c4-9dbc38104dc8" (UID: "11e96c53-2efc-403c-b5c4-9dbc38104dc8"). InnerVolumeSpecName "kube-api-access-mmlnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.632071 4830 generic.go:334] "Generic (PLEG): container finished" podID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerID="fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922" exitCode=0 Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.632165 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f855a9c-02a9-47af-832c-4a48c1cb26ff","Type":"ContainerDied","Data":"fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922"} Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.632189 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f855a9c-02a9-47af-832c-4a48c1cb26ff","Type":"ContainerDied","Data":"34fc9dc82e96eac43a58a72e73553400a113cd6dcd5ff9886fe64e582f3af392"} Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.632217 4830 scope.go:117] "RemoveContainer" containerID="95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.632302 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:37 crc kubenswrapper[4830]: E1203 22:27:37.634278 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4\": container with ID starting with 95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4 not found: ID does not exist" containerID="95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.634323 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4"} err="failed to get container status \"95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4\": rpc error: code = NotFound desc = could not find container \"95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4\": container with ID starting with 95b2910f5bf0702df1de8a4d8797fb036ee5be500f60cdc3d68daadb54cc87a4 not found: ID does not exist" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.634347 4830 scope.go:117] "RemoveContainer" containerID="1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.643466 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-hvnzw" podStartSLOduration=2.64344568 podStartE2EDuration="2.64344568s" podCreationTimestamp="2025-12-03 22:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:37.633732077 +0000 UTC m=+1346.630193436" watchObservedRunningTime="2025-12-03 22:27:37.64344568 +0000 UTC m=+1346.639907029" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.649262 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-765746756b-b9w2l" event={"ID":"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4","Type":"ContainerStarted","Data":"800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c"} Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.651689 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.651723 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.671493 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "11e96c53-2efc-403c-b5c4-9dbc38104dc8" (UID: "11e96c53-2efc-403c-b5c4-9dbc38104dc8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.693121 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-config-data\") pod \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.693260 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-scripts\") pod \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.693294 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-log-httpd\") pod \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\" (UID: 
\"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.693341 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-combined-ca-bundle\") pod \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.693406 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-run-httpd\") pod \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.693461 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg247\" (UniqueName: \"kubernetes.io/projected/7f855a9c-02a9-47af-832c-4a48c1cb26ff-kube-api-access-qg247\") pod \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.693659 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-sg-core-conf-yaml\") pod \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\" (UID: \"7f855a9c-02a9-47af-832c-4a48c1cb26ff\") " Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.694122 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmlnl\" (UniqueName: \"kubernetes.io/projected/11e96c53-2efc-403c-b5c4-9dbc38104dc8-kube-api-access-mmlnl\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.694136 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-swift-storage-0\") 
on node \"crc\" DevicePath \"\"" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.705655 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-config" (OuterVolumeSpecName: "config") pod "11e96c53-2efc-403c-b5c4-9dbc38104dc8" (UID: "11e96c53-2efc-403c-b5c4-9dbc38104dc8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.706035 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7f855a9c-02a9-47af-832c-4a48c1cb26ff" (UID: "7f855a9c-02a9-47af-832c-4a48c1cb26ff"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.707638 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-765746756b-b9w2l" podStartSLOduration=4.707026221 podStartE2EDuration="4.707026221s" podCreationTimestamp="2025-12-03 22:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:37.67778415 +0000 UTC m=+1346.674245499" watchObservedRunningTime="2025-12-03 22:27:37.707026221 +0000 UTC m=+1346.703487570" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.710139 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7f855a9c-02a9-47af-832c-4a48c1cb26ff" (UID: "7f855a9c-02a9-47af-832c-4a48c1cb26ff"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.711599 4830 generic.go:334] "Generic (PLEG): container finished" podID="dfbf712d-b734-4678-b329-b0a5e6ef0466" containerID="9526c97d95eb71420a894402027a800a25845ade1461ad3ff7a52bc4831ae884" exitCode=0 Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.711636 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" event={"ID":"dfbf712d-b734-4678-b329-b0a5e6ef0466","Type":"ContainerDied","Data":"9526c97d95eb71420a894402027a800a25845ade1461ad3ff7a52bc4831ae884"} Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.713678 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11e96c53-2efc-403c-b5c4-9dbc38104dc8" (UID: "11e96c53-2efc-403c-b5c4-9dbc38104dc8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.733124 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-scripts" (OuterVolumeSpecName: "scripts") pod "7f855a9c-02a9-47af-832c-4a48c1cb26ff" (UID: "7f855a9c-02a9-47af-832c-4a48c1cb26ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: E1203 22:27:37.733316 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-sb podName:11e96c53-2efc-403c-b5c4-9dbc38104dc8 nodeName:}" failed. No retries permitted until 2025-12-03 22:27:38.233291051 +0000 UTC m=+1347.229752390 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ovsdbserver-sb" (UniqueName: "kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-sb") pod "11e96c53-2efc-403c-b5c4-9dbc38104dc8" (UID: "11e96c53-2efc-403c-b5c4-9dbc38104dc8") : error deleting /var/lib/kubelet/pods/11e96c53-2efc-403c-b5c4-9dbc38104dc8/volume-subpaths: remove /var/lib/kubelet/pods/11e96c53-2efc-403c-b5c4-9dbc38104dc8/volume-subpaths: no such file or directory Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.733553 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f855a9c-02a9-47af-832c-4a48c1cb26ff-kube-api-access-qg247" (OuterVolumeSpecName: "kube-api-access-qg247") pod "7f855a9c-02a9-47af-832c-4a48c1cb26ff" (UID: "7f855a9c-02a9-47af-832c-4a48c1cb26ff"). InnerVolumeSpecName "kube-api-access-qg247". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.740372 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11e96c53-2efc-403c-b5c4-9dbc38104dc8" (UID: "11e96c53-2efc-403c-b5c4-9dbc38104dc8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.796033 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.796073 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg247\" (UniqueName: \"kubernetes.io/projected/7f855a9c-02a9-47af-832c-4a48c1cb26ff-kube-api-access-qg247\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.796085 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.796096 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.796109 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.796119 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.796129 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f855a9c-02a9-47af-832c-4a48c1cb26ff-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.802640 4830 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7f855a9c-02a9-47af-832c-4a48c1cb26ff" (UID: "7f855a9c-02a9-47af-832c-4a48c1cb26ff"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.831913 4830 scope.go:117] "RemoveContainer" containerID="96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.893783 4830 scope.go:117] "RemoveContainer" containerID="fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.897875 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.905622 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-config-data" (OuterVolumeSpecName: "config-data") pod "7f855a9c-02a9-47af-832c-4a48c1cb26ff" (UID: "7f855a9c-02a9-47af-832c-4a48c1cb26ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.924307 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f855a9c-02a9-47af-832c-4a48c1cb26ff" (UID: "7f855a9c-02a9-47af-832c-4a48c1cb26ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.940081 4830 scope.go:117] "RemoveContainer" containerID="5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.976314 4830 scope.go:117] "RemoveContainer" containerID="1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1" Dec 03 22:27:37 crc kubenswrapper[4830]: E1203 22:27:37.980021 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1\": container with ID starting with 1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1 not found: ID does not exist" containerID="1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.980070 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1"} err="failed to get container status \"1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1\": rpc error: code = NotFound desc = could not find container \"1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1\": container with ID starting with 1b32f5c759bca28340736d70c6fe80b6e82413f7ab7d4caac8351c0c4f9097d1 not found: ID does not exist" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.980102 4830 scope.go:117] "RemoveContainer" containerID="96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0" Dec 03 22:27:37 crc kubenswrapper[4830]: E1203 22:27:37.980983 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0\": container with ID starting with 
96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0 not found: ID does not exist" containerID="96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.981011 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0"} err="failed to get container status \"96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0\": rpc error: code = NotFound desc = could not find container \"96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0\": container with ID starting with 96b28956ebfc89ca9bd21bfe34673a35014e58292ef7689906a290d5c832d3e0 not found: ID does not exist" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.981029 4830 scope.go:117] "RemoveContainer" containerID="fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922" Dec 03 22:27:37 crc kubenswrapper[4830]: E1203 22:27:37.982185 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922\": container with ID starting with fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922 not found: ID does not exist" containerID="fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.982222 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922"} err="failed to get container status \"fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922\": rpc error: code = NotFound desc = could not find container \"fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922\": container with ID starting with fd629ffa6a26f68b8430e1a02aa47f0b7699d5cc28619cb2db155e0a1025c922 not found: ID does not 
exist" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.982241 4830 scope.go:117] "RemoveContainer" containerID="5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056" Dec 03 22:27:37 crc kubenswrapper[4830]: E1203 22:27:37.983276 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056\": container with ID starting with 5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056 not found: ID does not exist" containerID="5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056" Dec 03 22:27:37 crc kubenswrapper[4830]: I1203 22:27:37.983302 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056"} err="failed to get container status \"5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056\": rpc error: code = NotFound desc = could not find container \"5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056\": container with ID starting with 5b7d9a5a1b231a38b7314bfcc5d8e3387012ef2143ea48b2d118fdc7285a8056 not found: ID does not exist" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.006798 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.006829 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f855a9c-02a9-47af-832c-4a48c1cb26ff-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.015042 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.050303 4830 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.082440 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:38 crc kubenswrapper[4830]: E1203 22:27:38.082924 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e96c53-2efc-403c-b5c4-9dbc38104dc8" containerName="init" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.082936 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e96c53-2efc-403c-b5c4-9dbc38104dc8" containerName="init" Dec 03 22:27:38 crc kubenswrapper[4830]: E1203 22:27:38.082948 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="ceilometer-notification-agent" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.082954 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="ceilometer-notification-agent" Dec 03 22:27:38 crc kubenswrapper[4830]: E1203 22:27:38.082968 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="proxy-httpd" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.082974 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="proxy-httpd" Dec 03 22:27:38 crc kubenswrapper[4830]: E1203 22:27:38.082996 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="sg-core" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.083002 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="sg-core" Dec 03 22:27:38 crc kubenswrapper[4830]: E1203 22:27:38.083016 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" 
containerName="ceilometer-central-agent" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.083022 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="ceilometer-central-agent" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.083200 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e96c53-2efc-403c-b5c4-9dbc38104dc8" containerName="init" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.083214 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="ceilometer-notification-agent" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.083230 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="sg-core" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.083243 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="ceilometer-central-agent" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.083255 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" containerName="proxy-httpd" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.085162 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.100000 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.100188 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.136304 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.230765 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-config-data\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.231114 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv2bw\" (UniqueName: \"kubernetes.io/projected/1d44c909-f792-4543-8f3e-a168e708be4f-kube-api-access-dv2bw\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.231190 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-log-httpd\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.231223 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.231244 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-scripts\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.231284 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.231303 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-run-httpd\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.339176 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-sb\") pod \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\" (UID: \"11e96c53-2efc-403c-b5c4-9dbc38104dc8\") " Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.339394 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv2bw\" (UniqueName: \"kubernetes.io/projected/1d44c909-f792-4543-8f3e-a168e708be4f-kube-api-access-dv2bw\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.339463 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-log-httpd\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.339485 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.339504 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-scripts\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.339561 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.339577 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-run-httpd\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.339632 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-config-data\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " 
pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.366677 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "11e96c53-2efc-403c-b5c4-9dbc38104dc8" (UID: "11e96c53-2efc-403c-b5c4-9dbc38104dc8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.367792 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-config-data\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.375029 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-run-httpd\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.377239 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-scripts\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.382064 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.396312 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.401741 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-log-httpd\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.414320 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv2bw\" (UniqueName: \"kubernetes.io/projected/1d44c909-f792-4543-8f3e-a168e708be4f-kube-api-access-dv2bw\") pod \"ceilometer-0\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.424358 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.446837 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11e96c53-2efc-403c-b5c4-9dbc38104dc8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.483306 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.780379 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24663030-2270-470f-875a-fcf06af8047d","Type":"ContainerStarted","Data":"a0f3664d73852c1599bfd848ac0d5736f6628cdcfa3a7687c7ba70ae53381723"} Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.819287 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xqzjk"] Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.827709 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xqzjk"] Dec 03 22:27:38 crc kubenswrapper[4830]: I1203 22:27:38.851761 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e13f2fa-e70c-48c0-bf3a-03f436d88777","Type":"ContainerStarted","Data":"683f9bf4b814a652d9aca053be1b7d65a9056b811a5eab3c61da9320e566b5bf"} Dec 03 22:27:39 crc kubenswrapper[4830]: I1203 22:27:39.368625 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e96c53-2efc-403c-b5c4-9dbc38104dc8" path="/var/lib/kubelet/pods/11e96c53-2efc-403c-b5c4-9dbc38104dc8/volumes" Dec 03 22:27:39 crc kubenswrapper[4830]: I1203 22:27:39.369548 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f855a9c-02a9-47af-832c-4a48c1cb26ff" path="/var/lib/kubelet/pods/7f855a9c-02a9-47af-832c-4a48c1cb26ff/volumes" Dec 03 22:27:39 crc kubenswrapper[4830]: I1203 22:27:39.370268 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:39 crc kubenswrapper[4830]: I1203 22:27:39.528261 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 22:27:39 crc kubenswrapper[4830]: I1203 22:27:39.530037 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" 
Dec 03 22:27:39 crc kubenswrapper[4830]: I1203 22:27:39.573206 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 22:27:39 crc kubenswrapper[4830]: I1203 22:27:39.885924 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" event={"ID":"dfbf712d-b734-4678-b329-b0a5e6ef0466","Type":"ContainerStarted","Data":"eb2a93e12f2a45c937dbace85f3a555f346341d862829c4f8bbf76a5ab00eed1"} Dec 03 22:27:39 crc kubenswrapper[4830]: I1203 22:27:39.886470 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:39 crc kubenswrapper[4830]: I1203 22:27:39.889329 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 22:27:39 crc kubenswrapper[4830]: I1203 22:27:39.894631 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e13f2fa-e70c-48c0-bf3a-03f436d88777","Type":"ContainerStarted","Data":"345c7b89c187fe2c437e57f278f6a6ee7bf21687ef16375f713e0450e0f4cb68"} Dec 03 22:27:39 crc kubenswrapper[4830]: I1203 22:27:39.911791 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" podStartSLOduration=5.911777913 podStartE2EDuration="5.911777913s" podCreationTimestamp="2025-12-03 22:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:39.909881581 +0000 UTC m=+1348.906342930" watchObservedRunningTime="2025-12-03 22:27:39.911777913 +0000 UTC m=+1348.908239262" Dec 03 22:27:40 crc kubenswrapper[4830]: W1203 22:27:40.676772 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d44c909_f792_4543_8f3e_a168e708be4f.slice/crio-6703fce322ea6b53dc5de42b41bffb13ab46863f5f942cd75b26b417bf522cb9 WatchSource:0}: Error finding container 6703fce322ea6b53dc5de42b41bffb13ab46863f5f942cd75b26b417bf522cb9: Status 404 returned error can't find the container with id 6703fce322ea6b53dc5de42b41bffb13ab46863f5f942cd75b26b417bf522cb9 Dec 03 22:27:40 crc kubenswrapper[4830]: I1203 22:27:40.940652 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d44c909-f792-4543-8f3e-a168e708be4f","Type":"ContainerStarted","Data":"6703fce322ea6b53dc5de42b41bffb13ab46863f5f942cd75b26b417bf522cb9"} Dec 03 22:27:40 crc kubenswrapper[4830]: I1203 22:27:40.944411 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7e13f2fa-e70c-48c0-bf3a-03f436d88777" containerName="cinder-api-log" containerID="cri-o://683f9bf4b814a652d9aca053be1b7d65a9056b811a5eab3c61da9320e566b5bf" gracePeriod=30 Dec 03 22:27:40 crc kubenswrapper[4830]: I1203 22:27:40.944846 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7e13f2fa-e70c-48c0-bf3a-03f436d88777" containerName="cinder-api" containerID="cri-o://345c7b89c187fe2c437e57f278f6a6ee7bf21687ef16375f713e0450e0f4cb68" gracePeriod=30 Dec 03 22:27:40 crc kubenswrapper[4830]: I1203 22:27:40.945014 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24663030-2270-470f-875a-fcf06af8047d","Type":"ContainerStarted","Data":"1a4cc9dfeb00f11da13382c5052e217d4c36fca59023273635c0aa029efac2a3"} Dec 03 22:27:40 crc kubenswrapper[4830]: I1203 22:27:40.945040 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 22:27:40 crc kubenswrapper[4830]: I1203 22:27:40.974453 4830 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/cinder-api-0" podStartSLOduration=6.974436214 podStartE2EDuration="6.974436214s" podCreationTimestamp="2025-12-03 22:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:40.972732308 +0000 UTC m=+1349.969193657" watchObservedRunningTime="2025-12-03 22:27:40.974436214 +0000 UTC m=+1349.970897563" Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.018538 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6444ff6d8d-kq894"] Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.020966 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6444ff6d8d-kq894" Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.028006 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.028817 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.929567305 podStartE2EDuration="7.028799926s" podCreationTimestamp="2025-12-03 22:27:34 +0000 UTC" firstStartedPulling="2025-12-03 22:27:35.846609208 +0000 UTC m=+1344.843070547" lastFinishedPulling="2025-12-03 22:27:36.945841819 +0000 UTC m=+1345.942303168" observedRunningTime="2025-12-03 22:27:41.012630658 +0000 UTC m=+1350.009092027" watchObservedRunningTime="2025-12-03 22:27:41.028799926 +0000 UTC m=+1350.025261275" Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.029383 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.039830 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6444ff6d8d-kq894"] Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.138615 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-internal-tls-certs\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894" Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.138690 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqcb6\" (UniqueName: \"kubernetes.io/projected/9e17dece-7a57-4c56-b128-0316add6808f-kube-api-access-nqcb6\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894" Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.138722 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-config-data-custom\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894" Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.138763 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e17dece-7a57-4c56-b128-0316add6808f-logs\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894" Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.138803 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-combined-ca-bundle\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894" Dec 03 22:27:41 crc 
kubenswrapper[4830]: I1203 22:27:41.138862 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-config-data\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.138883 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-public-tls-certs\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.240594 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e17dece-7a57-4c56-b128-0316add6808f-logs\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.240653 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-combined-ca-bundle\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.240720 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-config-data\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.240743 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-public-tls-certs\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.240773 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-internal-tls-certs\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.240824 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqcb6\" (UniqueName: \"kubernetes.io/projected/9e17dece-7a57-4c56-b128-0316add6808f-kube-api-access-nqcb6\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.240852 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-config-data-custom\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.241937 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e17dece-7a57-4c56-b128-0316add6808f-logs\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.247403 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-config-data-custom\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.248156 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-config-data\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.249136 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-combined-ca-bundle\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.257235 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-public-tls-certs\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.261057 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e17dece-7a57-4c56-b128-0316add6808f-internal-tls-certs\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.267273 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqcb6\" (UniqueName: \"kubernetes.io/projected/9e17dece-7a57-4c56-b128-0316add6808f-kube-api-access-nqcb6\") pod \"barbican-api-6444ff6d8d-kq894\" (UID: \"9e17dece-7a57-4c56-b128-0316add6808f\") " pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:41 crc kubenswrapper[4830]: I1203 22:27:41.389894 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.013106 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" event={"ID":"56894369-6e4d-451e-b510-60c1cad4b111","Type":"ContainerStarted","Data":"5ecf71b516d7e4ce33f863ba0750b090bed06a17aa0aaedbabb9fc1c5e7862f1"}
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.036108 4830 generic.go:334] "Generic (PLEG): container finished" podID="7e13f2fa-e70c-48c0-bf3a-03f436d88777" containerID="345c7b89c187fe2c437e57f278f6a6ee7bf21687ef16375f713e0450e0f4cb68" exitCode=0
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.036148 4830 generic.go:334] "Generic (PLEG): container finished" podID="7e13f2fa-e70c-48c0-bf3a-03f436d88777" containerID="683f9bf4b814a652d9aca053be1b7d65a9056b811a5eab3c61da9320e566b5bf" exitCode=143
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.036200 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e13f2fa-e70c-48c0-bf3a-03f436d88777","Type":"ContainerDied","Data":"345c7b89c187fe2c437e57f278f6a6ee7bf21687ef16375f713e0450e0f4cb68"}
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.036229 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e13f2fa-e70c-48c0-bf3a-03f436d88777","Type":"ContainerDied","Data":"683f9bf4b814a652d9aca053be1b7d65a9056b811a5eab3c61da9320e566b5bf"}
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.047760 4830 generic.go:334] "Generic (PLEG): container finished" podID="5a4d391d-0a89-417b-b548-b4754e4dcc99" containerID="d9d5f16469b8123bc6499d5077a96e2a9678fe79a93885eead62c47e897619f7" exitCode=0
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.047931 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-hvnzw" event={"ID":"5a4d391d-0a89-417b-b548-b4754e4dcc99","Type":"ContainerDied","Data":"d9d5f16469b8123bc6499d5077a96e2a9678fe79a93885eead62c47e897619f7"}
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.048344 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.064449 4830 generic.go:334] "Generic (PLEG): container finished" podID="f9dee835-249e-46b9-ab11-44b52c0514c4" containerID="81a9c81b1d68fa3cbb7506f86caed908a6ae86e2a155eac519d103ea8d8850bd" exitCode=0
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.064552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffd7655fb-lgwcj" event={"ID":"f9dee835-249e-46b9-ab11-44b52c0514c4","Type":"ContainerDied","Data":"81a9c81b1d68fa3cbb7506f86caed908a6ae86e2a155eac519d103ea8d8850bd"}
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.069838 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c48ccfdc-rn6l7" event={"ID":"f622ae76-ce43-4025-90eb-e609fbe2a004","Type":"ContainerStarted","Data":"6e20cb461e1edfa780702b237a737e4da804bfaedca9418580b4216b58694028"}
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.149640 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6444ff6d8d-kq894"]
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.185627 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e13f2fa-e70c-48c0-bf3a-03f436d88777-etc-machine-id\") pod \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.185683 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data-custom\") pod \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.185769 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-scripts\") pod \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.185857 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e13f2fa-e70c-48c0-bf3a-03f436d88777-logs\") pod \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.185886 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data\") pod \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.185948 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-combined-ca-bundle\") pod \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.185997 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6wvm\" (UniqueName: \"kubernetes.io/projected/7e13f2fa-e70c-48c0-bf3a-03f436d88777-kube-api-access-s6wvm\") pod \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\" (UID: \"7e13f2fa-e70c-48c0-bf3a-03f436d88777\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.185751 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e13f2fa-e70c-48c0-bf3a-03f436d88777-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7e13f2fa-e70c-48c0-bf3a-03f436d88777" (UID: "7e13f2fa-e70c-48c0-bf3a-03f436d88777"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.188934 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e13f2fa-e70c-48c0-bf3a-03f436d88777-logs" (OuterVolumeSpecName: "logs") pod "7e13f2fa-e70c-48c0-bf3a-03f436d88777" (UID: "7e13f2fa-e70c-48c0-bf3a-03f436d88777"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.195899 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e13f2fa-e70c-48c0-bf3a-03f436d88777-kube-api-access-s6wvm" (OuterVolumeSpecName: "kube-api-access-s6wvm") pod "7e13f2fa-e70c-48c0-bf3a-03f436d88777" (UID: "7e13f2fa-e70c-48c0-bf3a-03f436d88777"). InnerVolumeSpecName "kube-api-access-s6wvm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.197134 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7e13f2fa-e70c-48c0-bf3a-03f436d88777" (UID: "7e13f2fa-e70c-48c0-bf3a-03f436d88777"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.199014 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-scripts" (OuterVolumeSpecName: "scripts") pod "7e13f2fa-e70c-48c0-bf3a-03f436d88777" (UID: "7e13f2fa-e70c-48c0-bf3a-03f436d88777"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.288760 4830 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e13f2fa-e70c-48c0-bf3a-03f436d88777-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.288791 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.288800 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.288809 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e13f2fa-e70c-48c0-bf3a-03f436d88777-logs\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.288818 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6wvm\" (UniqueName: \"kubernetes.io/projected/7e13f2fa-e70c-48c0-bf3a-03f436d88777-kube-api-access-s6wvm\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.393628 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e13f2fa-e70c-48c0-bf3a-03f436d88777" (UID: "7e13f2fa-e70c-48c0-bf3a-03f436d88777"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.398425 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ffd7655fb-lgwcj"
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.435709 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data" (OuterVolumeSpecName: "config-data") pod "7e13f2fa-e70c-48c0-bf3a-03f436d88777" (UID: "7e13f2fa-e70c-48c0-bf3a-03f436d88777"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.491242 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-ovndb-tls-certs\") pod \"f9dee835-249e-46b9-ab11-44b52c0514c4\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.491358 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5p58\" (UniqueName: \"kubernetes.io/projected/f9dee835-249e-46b9-ab11-44b52c0514c4-kube-api-access-g5p58\") pod \"f9dee835-249e-46b9-ab11-44b52c0514c4\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.491384 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-combined-ca-bundle\") pod \"f9dee835-249e-46b9-ab11-44b52c0514c4\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.491441 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-httpd-config\") pod \"f9dee835-249e-46b9-ab11-44b52c0514c4\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.491498 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-config\") pod \"f9dee835-249e-46b9-ab11-44b52c0514c4\" (UID: \"f9dee835-249e-46b9-ab11-44b52c0514c4\") "
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.492111 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.492136 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e13f2fa-e70c-48c0-bf3a-03f436d88777-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.497711 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9dee835-249e-46b9-ab11-44b52c0514c4-kube-api-access-g5p58" (OuterVolumeSpecName: "kube-api-access-g5p58") pod "f9dee835-249e-46b9-ab11-44b52c0514c4" (UID: "f9dee835-249e-46b9-ab11-44b52c0514c4"). InnerVolumeSpecName "kube-api-access-g5p58". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.500725 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f9dee835-249e-46b9-ab11-44b52c0514c4" (UID: "f9dee835-249e-46b9-ab11-44b52c0514c4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.562417 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9dee835-249e-46b9-ab11-44b52c0514c4" (UID: "f9dee835-249e-46b9-ab11-44b52c0514c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.566719 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-config" (OuterVolumeSpecName: "config") pod "f9dee835-249e-46b9-ab11-44b52c0514c4" (UID: "f9dee835-249e-46b9-ab11-44b52c0514c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.595285 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f9dee835-249e-46b9-ab11-44b52c0514c4" (UID: "f9dee835-249e-46b9-ab11-44b52c0514c4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.609811 4830 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.609847 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5p58\" (UniqueName: \"kubernetes.io/projected/f9dee835-249e-46b9-ab11-44b52c0514c4-kube-api-access-g5p58\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.609860 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.609871 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-httpd-config\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:42 crc kubenswrapper[4830]: I1203 22:27:42.609879 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9dee835-249e-46b9-ab11-44b52c0514c4-config\") on node \"crc\" DevicePath \"\""
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.079440 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d44c909-f792-4543-8f3e-a168e708be4f","Type":"ContainerStarted","Data":"29dbdb75077dffbc723cbb540ba7c93101d6335227086da0f41bee27a7835d1f"}
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.079793 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d44c909-f792-4543-8f3e-a168e708be4f","Type":"ContainerStarted","Data":"a5ec9370cccd68d3c38d3c9ea4404a08bf0587d4c2493dbbc5b9151554d09a82"}
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.081629 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" event={"ID":"56894369-6e4d-451e-b510-60c1cad4b111","Type":"ContainerStarted","Data":"5fd68681b6724836ebe49b2df830658e5e26802c38148cff39d72b069fad55d1"}
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.085614 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e13f2fa-e70c-48c0-bf3a-03f436d88777","Type":"ContainerDied","Data":"4bd8d9b39277504fb4e7a2ba6d8ae31225fa9bf24f7f22f52b93d8b676bcaf89"}
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.085646 4830 scope.go:117] "RemoveContainer" containerID="345c7b89c187fe2c437e57f278f6a6ee7bf21687ef16375f713e0450e0f4cb68"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.085745 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.094869 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffd7655fb-lgwcj" event={"ID":"f9dee835-249e-46b9-ab11-44b52c0514c4","Type":"ContainerDied","Data":"505a350f57ae5f32c7453e484113d0e182005fb1a8b6cb6ed103a8df1507ff7a"}
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.094951 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ffd7655fb-lgwcj"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.108149 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c48ccfdc-rn6l7" event={"ID":"f622ae76-ce43-4025-90eb-e609fbe2a004","Type":"ContainerStarted","Data":"6078707e0c6c30dfb4d2610ccd499b02896289511a0da1eb90a5bb31fd42486a"}
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.117730 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6444ff6d8d-kq894" event={"ID":"9e17dece-7a57-4c56-b128-0316add6808f","Type":"ContainerStarted","Data":"21d52c95e9dfe960ef4ba56442b200f52770d61d271d70659e60995c734463ae"}
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.117764 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.117775 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6444ff6d8d-kq894" event={"ID":"9e17dece-7a57-4c56-b128-0316add6808f","Type":"ContainerStarted","Data":"d64dac32c4d97d72419981e1f2264bb8052e21eddef553d39c9140e4baa2391b"}
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.117786 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6444ff6d8d-kq894" event={"ID":"9e17dece-7a57-4c56-b128-0316add6808f","Type":"ContainerStarted","Data":"796bd30e52891f21b152457ea928dbde584c7a737a603a8ae92a319932ead29a"}
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.117806 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6444ff6d8d-kq894"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.137902 4830 scope.go:117] "RemoveContainer" containerID="683f9bf4b814a652d9aca053be1b7d65a9056b811a5eab3c61da9320e566b5bf"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.145373 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8f8d56fd8-kd4r4" podStartSLOduration=3.613344649 podStartE2EDuration="10.14535226s" podCreationTimestamp="2025-12-03 22:27:33 +0000 UTC" firstStartedPulling="2025-12-03 22:27:34.998396551 +0000 UTC m=+1343.994857900" lastFinishedPulling="2025-12-03 22:27:41.530404162 +0000 UTC m=+1350.526865511" observedRunningTime="2025-12-03 22:27:43.130946061 +0000 UTC m=+1352.127407410" watchObservedRunningTime="2025-12-03 22:27:43.14535226 +0000 UTC m=+1352.141813609"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.168980 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6ffd7655fb-lgwcj"]
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.186940 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6ffd7655fb-lgwcj"]
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.206776 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.219802 4830 scope.go:117] "RemoveContainer" containerID="d2c625c0c678eff611eb1a6c14b34de1c71c8e5eca7ac7a7b00fa0643798dc98"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.227690 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.251532 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 03 22:27:43 crc kubenswrapper[4830]: E1203 22:27:43.251988 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e13f2fa-e70c-48c0-bf3a-03f436d88777" containerName="cinder-api"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.252001 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e13f2fa-e70c-48c0-bf3a-03f436d88777" containerName="cinder-api"
Dec 03 22:27:43 crc kubenswrapper[4830]: E1203 22:27:43.252028 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9dee835-249e-46b9-ab11-44b52c0514c4" containerName="neutron-api"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.252034 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9dee835-249e-46b9-ab11-44b52c0514c4" containerName="neutron-api"
Dec 03 22:27:43 crc kubenswrapper[4830]: E1203 22:27:43.252046 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9dee835-249e-46b9-ab11-44b52c0514c4" containerName="neutron-httpd"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.252052 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9dee835-249e-46b9-ab11-44b52c0514c4" containerName="neutron-httpd"
Dec 03 22:27:43 crc kubenswrapper[4830]: E1203 22:27:43.252072 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e13f2fa-e70c-48c0-bf3a-03f436d88777" containerName="cinder-api-log"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.252078 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e13f2fa-e70c-48c0-bf3a-03f436d88777" containerName="cinder-api-log"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.252282 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e13f2fa-e70c-48c0-bf3a-03f436d88777" containerName="cinder-api"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.252297 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e13f2fa-e70c-48c0-bf3a-03f436d88777" containerName="cinder-api-log"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.252305 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9dee835-249e-46b9-ab11-44b52c0514c4" containerName="neutron-api"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.252316 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9dee835-249e-46b9-ab11-44b52c0514c4" containerName="neutron-httpd"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.255982 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.258084 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.276796 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.277142 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.292002 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.292037 4830 scope.go:117] "RemoveContainer" containerID="81a9c81b1d68fa3cbb7506f86caed908a6ae86e2a155eac519d103ea8d8850bd"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.328162 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7c48ccfdc-rn6l7" podStartSLOduration=3.308454249 podStartE2EDuration="10.328144028s" podCreationTimestamp="2025-12-03 22:27:33 +0000 UTC" firstStartedPulling="2025-12-03 22:27:34.514935287 +0000 UTC m=+1343.511396626" lastFinishedPulling="2025-12-03 22:27:41.534625066 +0000 UTC m=+1350.531086405" observedRunningTime="2025-12-03 22:27:43.220180356 +0000 UTC m=+1352.216641705" watchObservedRunningTime="2025-12-03 22:27:43.328144028 +0000 UTC m=+1352.324605377"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.341264 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-public-tls-certs\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.341343 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-scripts\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.341370 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.341391 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974ffdb3-a522-4b55-bdf0-b935f1378f20-logs\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.341413 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-config-data-custom\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.341431 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.341461 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kx2n\" (UniqueName: \"kubernetes.io/projected/974ffdb3-a522-4b55-bdf0-b935f1378f20-kube-api-access-5kx2n\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.341535 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-config-data\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.341566 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/974ffdb3-a522-4b55-bdf0-b935f1378f20-etc-machine-id\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.353015 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6444ff6d8d-kq894" podStartSLOduration=3.352994 podStartE2EDuration="3.352994s" podCreationTimestamp="2025-12-03 22:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:43.254297459 +0000 UTC m=+1352.250758808" watchObservedRunningTime="2025-12-03 22:27:43.352994 +0000 UTC m=+1352.349455349"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.368114 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e13f2fa-e70c-48c0-bf3a-03f436d88777" path="/var/lib/kubelet/pods/7e13f2fa-e70c-48c0-bf3a-03f436d88777/volumes"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.368963 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9dee835-249e-46b9-ab11-44b52c0514c4" path="/var/lib/kubelet/pods/f9dee835-249e-46b9-ab11-44b52c0514c4/volumes"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.443996 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.444086 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kx2n\" (UniqueName: \"kubernetes.io/projected/974ffdb3-a522-4b55-bdf0-b935f1378f20-kube-api-access-5kx2n\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.444173 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-config-data\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.444201 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/974ffdb3-a522-4b55-bdf0-b935f1378f20-etc-machine-id\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.444278 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-public-tls-certs\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.444352 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-scripts\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.444395 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.444422 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974ffdb3-a522-4b55-bdf0-b935f1378f20-logs\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.444443 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-config-data-custom\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.453768 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-public-tls-certs\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.453868 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/974ffdb3-a522-4b55-bdf0-b935f1378f20-etc-machine-id\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0"
Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.455885 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974ffdb3-a522-4b55-bdf0-b935f1378f20-logs\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.461903 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-scripts\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.466934 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.468963 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-config-data-custom\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.472284 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-config-data\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.473338 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974ffdb3-a522-4b55-bdf0-b935f1378f20-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0" Dec 
03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.475862 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kx2n\" (UniqueName: \"kubernetes.io/projected/974ffdb3-a522-4b55-bdf0-b935f1378f20-kube-api-access-5kx2n\") pod \"cinder-api-0\" (UID: \"974ffdb3-a522-4b55-bdf0-b935f1378f20\") " pod="openstack/cinder-api-0" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.562780 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.619092 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.647706 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-config-data\") pod \"5a4d391d-0a89-417b-b548-b4754e4dcc99\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.647779 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-combined-ca-bundle\") pod \"5a4d391d-0a89-417b-b548-b4754e4dcc99\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.647829 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbzns\" (UniqueName: \"kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-kube-api-access-zbzns\") pod \"5a4d391d-0a89-417b-b548-b4754e4dcc99\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.647899 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-certs\") pod \"5a4d391d-0a89-417b-b548-b4754e4dcc99\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.648003 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-scripts\") pod \"5a4d391d-0a89-417b-b548-b4754e4dcc99\" (UID: \"5a4d391d-0a89-417b-b548-b4754e4dcc99\") " Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.653297 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-scripts" (OuterVolumeSpecName: "scripts") pod "5a4d391d-0a89-417b-b548-b4754e4dcc99" (UID: "5a4d391d-0a89-417b-b548-b4754e4dcc99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.655844 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-certs" (OuterVolumeSpecName: "certs") pod "5a4d391d-0a89-417b-b548-b4754e4dcc99" (UID: "5a4d391d-0a89-417b-b548-b4754e4dcc99"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.657716 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-kube-api-access-zbzns" (OuterVolumeSpecName: "kube-api-access-zbzns") pod "5a4d391d-0a89-417b-b548-b4754e4dcc99" (UID: "5a4d391d-0a89-417b-b548-b4754e4dcc99"). InnerVolumeSpecName "kube-api-access-zbzns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.749299 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-config-data" (OuterVolumeSpecName: "config-data") pod "5a4d391d-0a89-417b-b548-b4754e4dcc99" (UID: "5a4d391d-0a89-417b-b548-b4754e4dcc99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.750894 4830 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.750925 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.750938 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.750947 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbzns\" (UniqueName: \"kubernetes.io/projected/5a4d391d-0a89-417b-b548-b4754e4dcc99-kube-api-access-zbzns\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.765696 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a4d391d-0a89-417b-b548-b4754e4dcc99" (UID: "5a4d391d-0a89-417b-b548-b4754e4dcc99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:43 crc kubenswrapper[4830]: I1203 22:27:43.852823 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a4d391d-0a89-417b-b548-b4754e4dcc99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.130393 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d44c909-f792-4543-8f3e-a168e708be4f","Type":"ContainerStarted","Data":"5d352eddd42dae97311082ded3d862aa5ad5dc8ad15be9edb96e98677b5e62bf"} Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.136535 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-hvnzw" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.137152 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-hvnzw" event={"ID":"5a4d391d-0a89-417b-b548-b4754e4dcc99","Type":"ContainerDied","Data":"118f6c6d207733b091326073a1e19d627194294647c79f462a367de6363d4935"} Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.137185 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="118f6c6d207733b091326073a1e19d627194294647c79f462a367de6363d4935" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.159731 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 22:27:44 crc kubenswrapper[4830]: W1203 22:27:44.169487 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod974ffdb3_a522_4b55_bdf0_b935f1378f20.slice/crio-1baf0b13ceb5d69dca44996a424f85ed36aa4edb15891b87710c26907021a609 WatchSource:0}: Error finding container 1baf0b13ceb5d69dca44996a424f85ed36aa4edb15891b87710c26907021a609: Status 404 returned error can't find the container with id 
1baf0b13ceb5d69dca44996a424f85ed36aa4edb15891b87710c26907021a609 Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.327018 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:27:44 crc kubenswrapper[4830]: E1203 22:27:44.327640 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4d391d-0a89-417b-b548-b4754e4dcc99" containerName="cloudkitty-storageinit" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.327656 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4d391d-0a89-417b-b548-b4754e4dcc99" containerName="cloudkitty-storageinit" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.327872 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a4d391d-0a89-417b-b548-b4754e4dcc99" containerName="cloudkitty-storageinit" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.328554 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.331848 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.332014 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-pzjnl" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.332178 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.332278 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.334882 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.336582 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-proc-0"] Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.364376 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.364466 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-certs\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.364609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-scripts\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.364631 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzdn2\" (UniqueName: \"kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-kube-api-access-lzdn2\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.364680 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: 
I1203 22:27:44.364738 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.470937 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-certs\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.471002 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-scripts\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.471026 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzdn2\" (UniqueName: \"kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-kube-api-access-lzdn2\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.471059 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.471096 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.471162 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.479299 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.489803 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.490089 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-scripts\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.490210 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-certs\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc 
kubenswrapper[4830]: I1203 22:27:44.495305 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.512039 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzdn2\" (UniqueName: \"kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-kube-api-access-lzdn2\") pod \"cloudkitty-proc-0\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.532156 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-slgkf"] Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.532677 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" podUID="dfbf712d-b734-4678-b329-b0a5e6ef0466" containerName="dnsmasq-dns" containerID="cri-o://eb2a93e12f2a45c937dbace85f3a555f346341d862829c4f8bbf76a5ab00eed1" gracePeriod=10 Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.547631 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.571561 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-lrtz2"] Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.573301 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.603955 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-lrtz2"] Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.632628 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.634823 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.637692 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.660321 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.676730 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.678834 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5lgv\" (UniqueName: \"kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-kube-api-access-m5lgv\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.678930 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dffa483a-1441-426a-b6e4-655ed6286d69-logs\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.678955 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-config\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.678986 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-svc\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.684580 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.684651 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-scripts\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.684725 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.684750 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.684803 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.684831 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.684890 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6lrv\" (UniqueName: \"kubernetes.io/projected/33def065-6580-4b21-b0e1-ebdf0897c741-kube-api-access-d6lrv\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.684907 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-certs\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.684959 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.765607 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.786732 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.786790 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.786831 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6lrv\" (UniqueName: \"kubernetes.io/projected/33def065-6580-4b21-b0e1-ebdf0897c741-kube-api-access-d6lrv\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.786849 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-certs\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.786880 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.786900 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5lgv\" (UniqueName: \"kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-kube-api-access-m5lgv\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.786943 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dffa483a-1441-426a-b6e4-655ed6286d69-logs\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.786959 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-config\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.786988 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-svc\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.787013 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-nb\") pod 
\"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.787045 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-scripts\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.787085 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.787101 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.791978 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.792285 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dffa483a-1441-426a-b6e4-655ed6286d69-logs\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc 
kubenswrapper[4830]: I1203 22:27:44.793141 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.793651 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-config\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.793885 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-svc\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.796522 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.801242 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-certs\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.803958 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.804589 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.806360 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-scripts\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.817853 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6lrv\" (UniqueName: \"kubernetes.io/projected/33def065-6580-4b21-b0e1-ebdf0897c741-kube-api-access-d6lrv\") pod \"dnsmasq-dns-67bdc55879-lrtz2\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.818346 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:44 crc kubenswrapper[4830]: I1203 22:27:44.833307 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5lgv\" (UniqueName: \"kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-kube-api-access-m5lgv\") pod \"cloudkitty-api-0\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " 
pod="openstack/cloudkitty-api-0" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.017689 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.056791 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.172879 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.214176 4830 generic.go:334] "Generic (PLEG): container finished" podID="dfbf712d-b734-4678-b329-b0a5e6ef0466" containerID="eb2a93e12f2a45c937dbace85f3a555f346341d862829c4f8bbf76a5ab00eed1" exitCode=0 Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.214281 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" event={"ID":"dfbf712d-b734-4678-b329-b0a5e6ef0466","Type":"ContainerDied","Data":"eb2a93e12f2a45c937dbace85f3a555f346341d862829c4f8bbf76a5ab00eed1"} Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.231716 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"974ffdb3-a522-4b55-bdf0-b935f1378f20","Type":"ContainerStarted","Data":"1baf0b13ceb5d69dca44996a424f85ed36aa4edb15891b87710c26907021a609"} Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.414645 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.416112 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.516253 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-sb\") pod \"dfbf712d-b734-4678-b329-b0a5e6ef0466\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.516599 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-swift-storage-0\") pod \"dfbf712d-b734-4678-b329-b0a5e6ef0466\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.516681 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-config\") pod \"dfbf712d-b734-4678-b329-b0a5e6ef0466\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.516747 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-svc\") pod \"dfbf712d-b734-4678-b329-b0a5e6ef0466\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.516830 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-nb\") pod \"dfbf712d-b734-4678-b329-b0a5e6ef0466\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " Dec 03 22:27:45 crc 
kubenswrapper[4830]: I1203 22:27:45.516860 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwjpw\" (UniqueName: \"kubernetes.io/projected/dfbf712d-b734-4678-b329-b0a5e6ef0466-kube-api-access-wwjpw\") pod \"dfbf712d-b734-4678-b329-b0a5e6ef0466\" (UID: \"dfbf712d-b734-4678-b329-b0a5e6ef0466\") " Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.527611 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfbf712d-b734-4678-b329-b0a5e6ef0466-kube-api-access-wwjpw" (OuterVolumeSpecName: "kube-api-access-wwjpw") pod "dfbf712d-b734-4678-b329-b0a5e6ef0466" (UID: "dfbf712d-b734-4678-b329-b0a5e6ef0466"). InnerVolumeSpecName "kube-api-access-wwjpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.621013 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwjpw\" (UniqueName: \"kubernetes.io/projected/dfbf712d-b734-4678-b329-b0a5e6ef0466-kube-api-access-wwjpw\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.635234 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dfbf712d-b734-4678-b329-b0a5e6ef0466" (UID: "dfbf712d-b734-4678-b329-b0a5e6ef0466"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.646011 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-config" (OuterVolumeSpecName: "config") pod "dfbf712d-b734-4678-b329-b0a5e6ef0466" (UID: "dfbf712d-b734-4678-b329-b0a5e6ef0466"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.658794 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dfbf712d-b734-4678-b329-b0a5e6ef0466" (UID: "dfbf712d-b734-4678-b329-b0a5e6ef0466"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.665637 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dfbf712d-b734-4678-b329-b0a5e6ef0466" (UID: "dfbf712d-b734-4678-b329-b0a5e6ef0466"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.674972 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dfbf712d-b734-4678-b329-b0a5e6ef0466" (UID: "dfbf712d-b734-4678-b329-b0a5e6ef0466"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.726397 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.726684 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.726777 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.726834 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.726894 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfbf712d-b734-4678-b329-b0a5e6ef0466-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:45 crc kubenswrapper[4830]: I1203 22:27:45.875167 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.005639 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-lrtz2"] Dec 03 22:27:46 crc kubenswrapper[4830]: W1203 22:27:46.008978 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33def065_6580_4b21_b0e1_ebdf0897c741.slice/crio-04d257015d872c4947c0b5ad9e1ed6747d35a993f0e6d2281c0ad2c884962c7c WatchSource:0}: Error finding container 04d257015d872c4947c0b5ad9e1ed6747d35a993f0e6d2281c0ad2c884962c7c: Status 404 returned error can't find the container with id 04d257015d872c4947c0b5ad9e1ed6747d35a993f0e6d2281c0ad2c884962c7c Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.162583 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:27:46 crc kubenswrapper[4830]: W1203 22:27:46.162702 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddffa483a_1441_426a_b6e4_655ed6286d69.slice/crio-de5b2cc75a1beaec1eacccfb550d9da47ec06482aed727acd85502f58e449358 WatchSource:0}: Error finding container de5b2cc75a1beaec1eacccfb550d9da47ec06482aed727acd85502f58e449358: Status 404 returned error can't find the container with id de5b2cc75a1beaec1eacccfb550d9da47ec06482aed727acd85502f58e449358 Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.265097 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d44c909-f792-4543-8f3e-a168e708be4f","Type":"ContainerStarted","Data":"84ab53eace32d0a8cf5f5fdfa3f4b165d7d59972ed98a35cb8a74c43902ac1cf"} Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.266455 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.275650 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"dffa483a-1441-426a-b6e4-655ed6286d69","Type":"ContainerStarted","Data":"de5b2cc75a1beaec1eacccfb550d9da47ec06482aed727acd85502f58e449358"} Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.277296 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" event={"ID":"33def065-6580-4b21-b0e1-ebdf0897c741","Type":"ContainerStarted","Data":"04d257015d872c4947c0b5ad9e1ed6747d35a993f0e6d2281c0ad2c884962c7c"} Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.280201 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" event={"ID":"dfbf712d-b734-4678-b329-b0a5e6ef0466","Type":"ContainerDied","Data":"47fb9fb5fe0f8ce64e252f41aacf6212e61f8aeae26b7842ec3f050f464eb530"} Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.280244 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.280278 4830 scope.go:117] "RemoveContainer" containerID="eb2a93e12f2a45c937dbace85f3a555f346341d862829c4f8bbf76a5ab00eed1" Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.282756 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"974ffdb3-a522-4b55-bdf0-b935f1378f20","Type":"ContainerStarted","Data":"ed1d8f66aedf123eb95b8653550d30fa236dfe9d3712b44d332b5f86b3a4822d"} Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.284619 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="24663030-2270-470f-875a-fcf06af8047d" containerName="cinder-scheduler" containerID="cri-o://a0f3664d73852c1599bfd848ac0d5736f6628cdcfa3a7687c7ba70ae53381723" gracePeriod=30 Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.284888 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"737c1ba4-5eac-4592-8e83-6d8263eb257c","Type":"ContainerStarted","Data":"2f61caab540b96ec885ee096948d7120a7c71055a2ccb7ca2b430261279d0dc3"} Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.284937 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="24663030-2270-470f-875a-fcf06af8047d" containerName="probe" containerID="cri-o://1a4cc9dfeb00f11da13382c5052e217d4c36fca59023273635c0aa029efac2a3" gracePeriod=30 Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.291658 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.836653554 podStartE2EDuration="8.291642035s" podCreationTimestamp="2025-12-03 22:27:38 +0000 UTC" firstStartedPulling="2025-12-03 22:27:41.422274875 +0000 UTC m=+1350.418736224" lastFinishedPulling="2025-12-03 22:27:44.877263356 +0000 UTC m=+1353.873724705" observedRunningTime="2025-12-03 22:27:46.286856927 +0000 UTC m=+1355.283318276" watchObservedRunningTime="2025-12-03 22:27:46.291642035 +0000 UTC m=+1355.288103384" Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.335795 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-slgkf"] Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.347716 4830 scope.go:117] "RemoveContainer" containerID="9526c97d95eb71420a894402027a800a25845ade1461ad3ff7a52bc4831ae884" Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.350307 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-slgkf"] Dec 03 22:27:46 crc kubenswrapper[4830]: I1203 22:27:46.734704 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.301041 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"dffa483a-1441-426a-b6e4-655ed6286d69","Type":"ContainerStarted","Data":"77dd6d83427be575ba7b43aad51baec490b9cf520be0b187ad4ddd9e68942eff"} Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.301282 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"dffa483a-1441-426a-b6e4-655ed6286d69","Type":"ContainerStarted","Data":"a7cd584d48951da4bfca5d07a7bc2093cbe2dd17af6de21f63845d343c63f1a0"} Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.303360 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.306797 4830 generic.go:334] "Generic (PLEG): container finished" podID="33def065-6580-4b21-b0e1-ebdf0897c741" containerID="bf57fefff9e375e2f71a79e460f8a5512129d908ae453a7d5279f22cd5a4feb1" exitCode=0 Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.306924 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" event={"ID":"33def065-6580-4b21-b0e1-ebdf0897c741","Type":"ContainerDied","Data":"bf57fefff9e375e2f71a79e460f8a5512129d908ae453a7d5279f22cd5a4feb1"} Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.315631 4830 generic.go:334] "Generic (PLEG): container finished" podID="24663030-2270-470f-875a-fcf06af8047d" containerID="1a4cc9dfeb00f11da13382c5052e217d4c36fca59023273635c0aa029efac2a3" exitCode=0 Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.315697 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24663030-2270-470f-875a-fcf06af8047d","Type":"ContainerDied","Data":"1a4cc9dfeb00f11da13382c5052e217d4c36fca59023273635c0aa029efac2a3"} Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.327834 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"974ffdb3-a522-4b55-bdf0-b935f1378f20","Type":"ContainerStarted","Data":"0dfff69262ae1ee9ffe67cb76fb2abad1d754614c2dc3a3adf385665829a86c6"} Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.351751 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.351701377 podStartE2EDuration="3.351701377s" 
podCreationTimestamp="2025-12-03 22:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:47.321788027 +0000 UTC m=+1356.318249366" watchObservedRunningTime="2025-12-03 22:27:47.351701377 +0000 UTC m=+1356.348162716" Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.390433 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfbf712d-b734-4678-b329-b0a5e6ef0466" path="/var/lib/kubelet/pods/dfbf712d-b734-4678-b329-b0a5e6ef0466/volumes" Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.402707 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.402687687 podStartE2EDuration="4.402687687s" podCreationTimestamp="2025-12-03 22:27:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:47.399833079 +0000 UTC m=+1356.396294448" watchObservedRunningTime="2025-12-03 22:27:47.402687687 +0000 UTC m=+1356.399149036" Dec 03 22:27:47 crc kubenswrapper[4830]: I1203 22:27:47.802303 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:48 crc kubenswrapper[4830]: I1203 22:27:48.093198 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:27:48 crc kubenswrapper[4830]: I1203 22:27:48.338753 4830 generic.go:334] "Generic (PLEG): container finished" podID="24663030-2270-470f-875a-fcf06af8047d" containerID="a0f3664d73852c1599bfd848ac0d5736f6628cdcfa3a7687c7ba70ae53381723" exitCode=0 Dec 03 22:27:48 crc kubenswrapper[4830]: I1203 22:27:48.339722 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"24663030-2270-470f-875a-fcf06af8047d","Type":"ContainerDied","Data":"a0f3664d73852c1599bfd848ac0d5736f6628cdcfa3a7687c7ba70ae53381723"} Dec 03 22:27:48 crc kubenswrapper[4830]: I1203 22:27:48.339755 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 22:27:48 crc kubenswrapper[4830]: I1203 22:27:48.903280 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.007885 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlt7m\" (UniqueName: \"kubernetes.io/projected/24663030-2270-470f-875a-fcf06af8047d-kube-api-access-zlt7m\") pod \"24663030-2270-470f-875a-fcf06af8047d\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.008909 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-combined-ca-bundle\") pod \"24663030-2270-470f-875a-fcf06af8047d\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.009081 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24663030-2270-470f-875a-fcf06af8047d-etc-machine-id\") pod \"24663030-2270-470f-875a-fcf06af8047d\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.009184 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data-custom\") pod \"24663030-2270-470f-875a-fcf06af8047d\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.009282 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-scripts\") pod \"24663030-2270-470f-875a-fcf06af8047d\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.009344 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data\") pod \"24663030-2270-470f-875a-fcf06af8047d\" (UID: \"24663030-2270-470f-875a-fcf06af8047d\") " Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.010298 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24663030-2270-470f-875a-fcf06af8047d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "24663030-2270-470f-875a-fcf06af8047d" (UID: "24663030-2270-470f-875a-fcf06af8047d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.013250 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24663030-2270-470f-875a-fcf06af8047d-kube-api-access-zlt7m" (OuterVolumeSpecName: "kube-api-access-zlt7m") pod "24663030-2270-470f-875a-fcf06af8047d" (UID: "24663030-2270-470f-875a-fcf06af8047d"). InnerVolumeSpecName "kube-api-access-zlt7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.018977 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "24663030-2270-470f-875a-fcf06af8047d" (UID: "24663030-2270-470f-875a-fcf06af8047d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.020073 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-scripts" (OuterVolumeSpecName: "scripts") pod "24663030-2270-470f-875a-fcf06af8047d" (UID: "24663030-2270-470f-875a-fcf06af8047d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.067293 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24663030-2270-470f-875a-fcf06af8047d" (UID: "24663030-2270-470f-875a-fcf06af8047d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.111094 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlt7m\" (UniqueName: \"kubernetes.io/projected/24663030-2270-470f-875a-fcf06af8047d-kube-api-access-zlt7m\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.111488 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.111501 4830 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24663030-2270-470f-875a-fcf06af8047d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.111541 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data-custom\") on node \"crc\" 
DevicePath \"\"" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.111553 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.161615 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data" (OuterVolumeSpecName: "config-data") pod "24663030-2270-470f-875a-fcf06af8047d" (UID: "24663030-2270-470f-875a-fcf06af8047d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.212754 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24663030-2270-470f-875a-fcf06af8047d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.350458 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.350519 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" event={"ID":"33def065-6580-4b21-b0e1-ebdf0897c741","Type":"ContainerStarted","Data":"a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9"} Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.352658 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24663030-2270-470f-875a-fcf06af8047d","Type":"ContainerDied","Data":"502cbf317e47edae675fc6102384397ea2ba6836692e23f8925f15492b1c6c33"} Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.352714 4830 scope.go:117] "RemoveContainer" containerID="1a4cc9dfeb00f11da13382c5052e217d4c36fca59023273635c0aa029efac2a3" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.352818 4830 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.358552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"737c1ba4-5eac-4592-8e83-6d8263eb257c","Type":"ContainerStarted","Data":"8eb8d04d323be9531430e60db5b7a09c9649781929f56a5c5029fcea6df97bea"} Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.359005 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="dffa483a-1441-426a-b6e4-655ed6286d69" containerName="cloudkitty-api-log" containerID="cri-o://a7cd584d48951da4bfca5d07a7bc2093cbe2dd17af6de21f63845d343c63f1a0" gracePeriod=30 Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.359046 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="dffa483a-1441-426a-b6e4-655ed6286d69" containerName="cloudkitty-api" containerID="cri-o://77dd6d83427be575ba7b43aad51baec490b9cf520be0b187ad4ddd9e68942eff" gracePeriod=30 Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.396153 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" podStartSLOduration=5.396138 podStartE2EDuration="5.396138s" podCreationTimestamp="2025-12-03 22:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:49.393367895 +0000 UTC m=+1358.389829244" watchObservedRunningTime="2025-12-03 22:27:49.396138 +0000 UTC m=+1358.392599349" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.401858 4830 scope.go:117] "RemoveContainer" containerID="a0f3664d73852c1599bfd848ac0d5736f6628cdcfa3a7687c7ba70ae53381723" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.424001 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-scheduler-0"] Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.442474 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.473879 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 22:27:49 crc kubenswrapper[4830]: E1203 22:27:49.474479 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24663030-2270-470f-875a-fcf06af8047d" containerName="cinder-scheduler" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.474564 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="24663030-2270-470f-875a-fcf06af8047d" containerName="cinder-scheduler" Dec 03 22:27:49 crc kubenswrapper[4830]: E1203 22:27:49.474647 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfbf712d-b734-4678-b329-b0a5e6ef0466" containerName="init" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.474707 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfbf712d-b734-4678-b329-b0a5e6ef0466" containerName="init" Dec 03 22:27:49 crc kubenswrapper[4830]: E1203 22:27:49.474762 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfbf712d-b734-4678-b329-b0a5e6ef0466" containerName="dnsmasq-dns" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.474817 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfbf712d-b734-4678-b329-b0a5e6ef0466" containerName="dnsmasq-dns" Dec 03 22:27:49 crc kubenswrapper[4830]: E1203 22:27:49.474885 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24663030-2270-470f-875a-fcf06af8047d" containerName="probe" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.474937 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="24663030-2270-470f-875a-fcf06af8047d" containerName="probe" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.475171 4830 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="dfbf712d-b734-4678-b329-b0a5e6ef0466" containerName="dnsmasq-dns" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.475245 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="24663030-2270-470f-875a-fcf06af8047d" containerName="probe" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.475308 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="24663030-2270-470f-875a-fcf06af8047d" containerName="cinder-scheduler" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.484539 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.857788348 podStartE2EDuration="5.484521532s" podCreationTimestamp="2025-12-03 22:27:44 +0000 UTC" firstStartedPulling="2025-12-03 22:27:45.886781828 +0000 UTC m=+1354.883243177" lastFinishedPulling="2025-12-03 22:27:48.513515012 +0000 UTC m=+1357.509976361" observedRunningTime="2025-12-03 22:27:49.47853361 +0000 UTC m=+1358.474994959" watchObservedRunningTime="2025-12-03 22:27:49.484521532 +0000 UTC m=+1358.480982881" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.485123 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.487682 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.521068 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.521135 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/499f12cf-14df-48d9-b2ee-11691c85e1ed-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.521305 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.521331 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-scripts\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.521395 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5plp8\" (UniqueName: 
\"kubernetes.io/projected/499f12cf-14df-48d9-b2ee-11691c85e1ed-kube-api-access-5plp8\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.521421 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-config-data\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.534644 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.540073 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.623025 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.623070 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-scripts\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.623108 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5plp8\" (UniqueName: \"kubernetes.io/projected/499f12cf-14df-48d9-b2ee-11691c85e1ed-kube-api-access-5plp8\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 
03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.623133 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-config-data\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.623170 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.623193 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/499f12cf-14df-48d9-b2ee-11691c85e1ed-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.623259 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/499f12cf-14df-48d9-b2ee-11691c85e1ed-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.628238 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-scripts\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.631063 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-config-data\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.632322 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.632901 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/499f12cf-14df-48d9-b2ee-11691c85e1ed-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.668101 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5plp8\" (UniqueName: \"kubernetes.io/projected/499f12cf-14df-48d9-b2ee-11691c85e1ed-kube-api-access-5plp8\") pod \"cinder-scheduler-0\" (UID: \"499f12cf-14df-48d9-b2ee-11691c85e1ed\") " pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.860874 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 22:27:49 crc kubenswrapper[4830]: I1203 22:27:49.881103 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-slgkf" podUID="dfbf712d-b734-4678-b329-b0a5e6ef0466" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.180:5353: i/o timeout" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.367076 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.414765 4830 generic.go:334] "Generic (PLEG): container finished" podID="dffa483a-1441-426a-b6e4-655ed6286d69" containerID="77dd6d83427be575ba7b43aad51baec490b9cf520be0b187ad4ddd9e68942eff" exitCode=0 Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.414801 4830 generic.go:334] "Generic (PLEG): container finished" podID="dffa483a-1441-426a-b6e4-655ed6286d69" containerID="a7cd584d48951da4bfca5d07a7bc2093cbe2dd17af6de21f63845d343c63f1a0" exitCode=143 Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.414892 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"dffa483a-1441-426a-b6e4-655ed6286d69","Type":"ContainerDied","Data":"77dd6d83427be575ba7b43aad51baec490b9cf520be0b187ad4ddd9e68942eff"} Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.414920 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"dffa483a-1441-426a-b6e4-655ed6286d69","Type":"ContainerDied","Data":"a7cd584d48951da4bfca5d07a7bc2093cbe2dd17af6de21f63845d343c63f1a0"} Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.621697 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.648209 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dffa483a-1441-426a-b6e4-655ed6286d69-logs\") pod \"dffa483a-1441-426a-b6e4-655ed6286d69\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.648255 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data\") pod \"dffa483a-1441-426a-b6e4-655ed6286d69\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.648320 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5lgv\" (UniqueName: \"kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-kube-api-access-m5lgv\") pod \"dffa483a-1441-426a-b6e4-655ed6286d69\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.648348 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data-custom\") pod \"dffa483a-1441-426a-b6e4-655ed6286d69\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.648367 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-scripts\") pod \"dffa483a-1441-426a-b6e4-655ed6286d69\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.648392 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-combined-ca-bundle\") pod \"dffa483a-1441-426a-b6e4-655ed6286d69\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.648437 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-certs\") pod \"dffa483a-1441-426a-b6e4-655ed6286d69\" (UID: \"dffa483a-1441-426a-b6e4-655ed6286d69\") " Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.648726 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dffa483a-1441-426a-b6e4-655ed6286d69-logs" (OuterVolumeSpecName: "logs") pod "dffa483a-1441-426a-b6e4-655ed6286d69" (UID: "dffa483a-1441-426a-b6e4-655ed6286d69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.683316 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-kube-api-access-m5lgv" (OuterVolumeSpecName: "kube-api-access-m5lgv") pod "dffa483a-1441-426a-b6e4-655ed6286d69" (UID: "dffa483a-1441-426a-b6e4-655ed6286d69"). InnerVolumeSpecName "kube-api-access-m5lgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.690121 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-certs" (OuterVolumeSpecName: "certs") pod "dffa483a-1441-426a-b6e4-655ed6286d69" (UID: "dffa483a-1441-426a-b6e4-655ed6286d69"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.690411 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dffa483a-1441-426a-b6e4-655ed6286d69" (UID: "dffa483a-1441-426a-b6e4-655ed6286d69"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.691758 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data" (OuterVolumeSpecName: "config-data") pod "dffa483a-1441-426a-b6e4-655ed6286d69" (UID: "dffa483a-1441-426a-b6e4-655ed6286d69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.694636 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-scripts" (OuterVolumeSpecName: "scripts") pod "dffa483a-1441-426a-b6e4-655ed6286d69" (UID: "dffa483a-1441-426a-b6e4-655ed6286d69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.699608 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dffa483a-1441-426a-b6e4-655ed6286d69" (UID: "dffa483a-1441-426a-b6e4-655ed6286d69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.750305 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dffa483a-1441-426a-b6e4-655ed6286d69-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.750335 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.750345 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5lgv\" (UniqueName: \"kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-kube-api-access-m5lgv\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.750442 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.750454 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.750462 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dffa483a-1441-426a-b6e4-655ed6286d69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:50 crc kubenswrapper[4830]: I1203 22:27:50.750470 4830 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/dffa483a-1441-426a-b6e4-655ed6286d69-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.356365 4830 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="24663030-2270-470f-875a-fcf06af8047d" path="/var/lib/kubelet/pods/24663030-2270-470f-875a-fcf06af8047d/volumes" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.455463 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"499f12cf-14df-48d9-b2ee-11691c85e1ed","Type":"ContainerStarted","Data":"e845a306e3b3a8c61fdf06573d3fca5fa190412866c39f7c60afaebfbdb05db9"} Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.455815 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"499f12cf-14df-48d9-b2ee-11691c85e1ed","Type":"ContainerStarted","Data":"53ab61e4ba1ee37ef7f21d52b39559d3b867887368c045704b4d4709cb75feee"} Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.463301 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="737c1ba4-5eac-4592-8e83-6d8263eb257c" containerName="cloudkitty-proc" containerID="cri-o://8eb8d04d323be9531430e60db5b7a09c9649781929f56a5c5029fcea6df97bea" gracePeriod=30 Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.463408 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"dffa483a-1441-426a-b6e4-655ed6286d69","Type":"ContainerDied","Data":"de5b2cc75a1beaec1eacccfb550d9da47ec06482aed727acd85502f58e449358"} Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.463445 4830 scope.go:117] "RemoveContainer" containerID="77dd6d83427be575ba7b43aad51baec490b9cf520be0b187ad4ddd9e68942eff" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.463591 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.521568 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.526421 4830 scope.go:117] "RemoveContainer" containerID="a7cd584d48951da4bfca5d07a7bc2093cbe2dd17af6de21f63845d343c63f1a0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.534145 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.545978 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:27:51 crc kubenswrapper[4830]: E1203 22:27:51.546546 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffa483a-1441-426a-b6e4-655ed6286d69" containerName="cloudkitty-api" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.546559 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffa483a-1441-426a-b6e4-655ed6286d69" containerName="cloudkitty-api" Dec 03 22:27:51 crc kubenswrapper[4830]: E1203 22:27:51.546606 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffa483a-1441-426a-b6e4-655ed6286d69" containerName="cloudkitty-api-log" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.546613 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffa483a-1441-426a-b6e4-655ed6286d69" containerName="cloudkitty-api-log" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.546892 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dffa483a-1441-426a-b6e4-655ed6286d69" containerName="cloudkitty-api" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.546956 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dffa483a-1441-426a-b6e4-655ed6286d69" containerName="cloudkitty-api-log" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.549385 4830 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.562225 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.562466 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.562589 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.566539 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.573425 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkzgn\" (UniqueName: \"kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-kube-api-access-vkzgn\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.573611 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-logs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.573688 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.573744 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-certs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.573769 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.573812 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.573874 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.573972 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-scripts\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.574087 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.675548 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.675623 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-certs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.675644 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.675661 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.675700 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" 
Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.675738 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-scripts\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.675789 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.675822 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkzgn\" (UniqueName: \"kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-kube-api-access-vkzgn\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.675877 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-logs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.676743 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-logs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.701654 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkzgn\" (UniqueName: 
\"kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-kube-api-access-vkzgn\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.714383 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.714908 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.715282 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.717097 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-scripts\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.717981 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-certs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 
22:27:51.718622 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.718657 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " pod="openstack/cloudkitty-api-0" Dec 03 22:27:51 crc kubenswrapper[4830]: I1203 22:27:51.910672 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 22:27:52 crc kubenswrapper[4830]: I1203 22:27:52.505411 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"499f12cf-14df-48d9-b2ee-11691c85e1ed","Type":"ContainerStarted","Data":"2540185968c257650f9dc204141995877a6872393ab2e6a28f577458700dd585"} Dec 03 22:27:52 crc kubenswrapper[4830]: I1203 22:27:52.533290 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.533274157 podStartE2EDuration="3.533274157s" podCreationTimestamp="2025-12-03 22:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:52.531750766 +0000 UTC m=+1361.528212105" watchObservedRunningTime="2025-12-03 22:27:52.533274157 +0000 UTC m=+1361.529735506" Dec 03 22:27:52 crc kubenswrapper[4830]: I1203 22:27:52.577712 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:27:53 crc kubenswrapper[4830]: I1203 22:27:53.363888 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dffa483a-1441-426a-b6e4-655ed6286d69" path="/var/lib/kubelet/pods/dffa483a-1441-426a-b6e4-655ed6286d69/volumes" Dec 03 22:27:53 crc kubenswrapper[4830]: I1203 22:27:53.516900 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae","Type":"ContainerStarted","Data":"57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563"} Dec 03 22:27:53 crc kubenswrapper[4830]: I1203 22:27:53.516951 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae","Type":"ContainerStarted","Data":"2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129"} Dec 03 22:27:53 crc kubenswrapper[4830]: I1203 22:27:53.516965 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae","Type":"ContainerStarted","Data":"fe5b7a4c77e70cdb61062f2b6d3965e1d41a849aa2695f9a2beb627a3c763657"} Dec 03 22:27:53 crc kubenswrapper[4830]: I1203 22:27:53.517100 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Dec 03 22:27:53 crc kubenswrapper[4830]: I1203 22:27:53.545282 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.545266518 podStartE2EDuration="2.545266518s" podCreationTimestamp="2025-12-03 22:27:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:53.538843523 +0000 UTC m=+1362.535304872" watchObservedRunningTime="2025-12-03 22:27:53.545266518 +0000 UTC m=+1362.541727867" Dec 03 22:27:53 crc kubenswrapper[4830]: I1203 22:27:53.778039 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-79b896c7bd-gsfgm" Dec 03 22:27:54 crc kubenswrapper[4830]: I1203 22:27:54.542420 4830 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6444ff6d8d-kq894" Dec 03 22:27:54 crc kubenswrapper[4830]: I1203 22:27:54.861235 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.019684 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.087131 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.088568 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.093528 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.093746 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-sdrs9" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.093886 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.097570 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.113992 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jrb5f"] Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.114202 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" podUID="1761df38-e690-4267-bc27-35ee08e90130" containerName="dnsmasq-dns" containerID="cri-o://b13bee9e39f1a04463e09a086e7208e4fd3c41e28c44587fc8464d629ac2ade7" gracePeriod=10 Dec 03 
22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.165197 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.165286 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.165385 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config-secret\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.165451 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cftdb\" (UniqueName: \"kubernetes.io/projected/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-kube-api-access-cftdb\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.266921 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config-secret\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.267005 
4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cftdb\" (UniqueName: \"kubernetes.io/projected/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-kube-api-access-cftdb\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.267038 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.267082 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.267835 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.275840 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config-secret\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.279270 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.292385 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cftdb\" (UniqueName: \"kubernetes.io/projected/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-kube-api-access-cftdb\") pod \"openstackclient\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.396963 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.397711 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.433199 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6444ff6d8d-kq894" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.447345 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.475589 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.479571 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.534879 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.605421 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wm75\" (UniqueName: \"kubernetes.io/projected/670e6335-d34c-46e4-8b4d-89dbd65c35a7-kube-api-access-6wm75\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.605609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670e6335-d34c-46e4-8b4d-89dbd65c35a7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.606133 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/670e6335-d34c-46e4-8b4d-89dbd65c35a7-openstack-config\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.606207 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/670e6335-d34c-46e4-8b4d-89dbd65c35a7-openstack-config-secret\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.623814 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-765746756b-b9w2l"] Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 
22:27:55.624008 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-765746756b-b9w2l" podUID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerName="barbican-api-log" containerID="cri-o://c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d" gracePeriod=30 Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.624399 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-765746756b-b9w2l" podUID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerName="barbican-api" containerID="cri-o://800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c" gracePeriod=30 Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.639160 4830 generic.go:334] "Generic (PLEG): container finished" podID="1761df38-e690-4267-bc27-35ee08e90130" containerID="b13bee9e39f1a04463e09a086e7208e4fd3c41e28c44587fc8464d629ac2ade7" exitCode=0 Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.640083 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" event={"ID":"1761df38-e690-4267-bc27-35ee08e90130","Type":"ContainerDied","Data":"b13bee9e39f1a04463e09a086e7208e4fd3c41e28c44587fc8464d629ac2ade7"} Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.707744 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wm75\" (UniqueName: \"kubernetes.io/projected/670e6335-d34c-46e4-8b4d-89dbd65c35a7-kube-api-access-6wm75\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.707793 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670e6335-d34c-46e4-8b4d-89dbd65c35a7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 
22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.707820 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/670e6335-d34c-46e4-8b4d-89dbd65c35a7-openstack-config\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.707867 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/670e6335-d34c-46e4-8b4d-89dbd65c35a7-openstack-config-secret\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.712355 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/670e6335-d34c-46e4-8b4d-89dbd65c35a7-openstack-config\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.740298 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/670e6335-d34c-46e4-8b4d-89dbd65c35a7-openstack-config-secret\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.742411 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wm75\" (UniqueName: \"kubernetes.io/projected/670e6335-d34c-46e4-8b4d-89dbd65c35a7-kube-api-access-6wm75\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.759169 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/670e6335-d34c-46e4-8b4d-89dbd65c35a7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"670e6335-d34c-46e4-8b4d-89dbd65c35a7\") " pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: E1203 22:27:55.761937 4830 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 22:27:55 crc kubenswrapper[4830]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_cdbb7cb0-5934-40ce-a8b2-df7eef10df54_0(40f8ec6dc68b94bc15a44a223f0ec04149affbb605c120d7f73a08b1fc486385): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"40f8ec6dc68b94bc15a44a223f0ec04149affbb605c120d7f73a08b1fc486385" Netns:"/var/run/netns/1ecdbd62-ba06-47a8-b571-5edc12694305" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=40f8ec6dc68b94bc15a44a223f0ec04149affbb605c120d7f73a08b1fc486385;K8S_POD_UID=cdbb7cb0-5934-40ce-a8b2-df7eef10df54" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/cdbb7cb0-5934-40ce-a8b2-df7eef10df54]: expected pod UID "cdbb7cb0-5934-40ce-a8b2-df7eef10df54" but got "670e6335-d34c-46e4-8b4d-89dbd65c35a7" from Kube API Dec 03 22:27:55 crc kubenswrapper[4830]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 22:27:55 crc kubenswrapper[4830]: > Dec 03 22:27:55 crc kubenswrapper[4830]: E1203 22:27:55.761992 4830 kuberuntime_sandbox.go:72] "Failed to create 
sandbox for pod" err=< Dec 03 22:27:55 crc kubenswrapper[4830]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_cdbb7cb0-5934-40ce-a8b2-df7eef10df54_0(40f8ec6dc68b94bc15a44a223f0ec04149affbb605c120d7f73a08b1fc486385): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"40f8ec6dc68b94bc15a44a223f0ec04149affbb605c120d7f73a08b1fc486385" Netns:"/var/run/netns/1ecdbd62-ba06-47a8-b571-5edc12694305" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=40f8ec6dc68b94bc15a44a223f0ec04149affbb605c120d7f73a08b1fc486385;K8S_POD_UID=cdbb7cb0-5934-40ce-a8b2-df7eef10df54" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/cdbb7cb0-5934-40ce-a8b2-df7eef10df54]: expected pod UID "cdbb7cb0-5934-40ce-a8b2-df7eef10df54" but got "670e6335-d34c-46e4-8b4d-89dbd65c35a7" from Kube API Dec 03 22:27:55 crc kubenswrapper[4830]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 22:27:55 crc kubenswrapper[4830]: > pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.907974 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 22:27:55 crc kubenswrapper[4830]: I1203 22:27:55.950897 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.015970 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-nb\") pod \"1761df38-e690-4267-bc27-35ee08e90130\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.016248 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-svc\") pod \"1761df38-e690-4267-bc27-35ee08e90130\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.016296 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-sb\") pod \"1761df38-e690-4267-bc27-35ee08e90130\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.016387 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-swift-storage-0\") pod \"1761df38-e690-4267-bc27-35ee08e90130\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.016436 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dllpw\" (UniqueName: \"kubernetes.io/projected/1761df38-e690-4267-bc27-35ee08e90130-kube-api-access-dllpw\") pod \"1761df38-e690-4267-bc27-35ee08e90130\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.016639 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-config\") pod \"1761df38-e690-4267-bc27-35ee08e90130\" (UID: \"1761df38-e690-4267-bc27-35ee08e90130\") " Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.048474 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1761df38-e690-4267-bc27-35ee08e90130-kube-api-access-dllpw" (OuterVolumeSpecName: "kube-api-access-dllpw") pod "1761df38-e690-4267-bc27-35ee08e90130" (UID: "1761df38-e690-4267-bc27-35ee08e90130"). InnerVolumeSpecName "kube-api-access-dllpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.110059 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1761df38-e690-4267-bc27-35ee08e90130" (UID: "1761df38-e690-4267-bc27-35ee08e90130"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.121022 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.121068 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dllpw\" (UniqueName: \"kubernetes.io/projected/1761df38-e690-4267-bc27-35ee08e90130-kube-api-access-dllpw\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.186088 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1761df38-e690-4267-bc27-35ee08e90130" (UID: "1761df38-e690-4267-bc27-35ee08e90130"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.187979 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-config" (OuterVolumeSpecName: "config") pod "1761df38-e690-4267-bc27-35ee08e90130" (UID: "1761df38-e690-4267-bc27-35ee08e90130"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.189195 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1761df38-e690-4267-bc27-35ee08e90130" (UID: "1761df38-e690-4267-bc27-35ee08e90130"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.195456 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1761df38-e690-4267-bc27-35ee08e90130" (UID: "1761df38-e690-4267-bc27-35ee08e90130"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.228746 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.228776 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.228787 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.228795 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1761df38-e690-4267-bc27-35ee08e90130-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.572069 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.662784 4830 generic.go:334] "Generic (PLEG): container finished" podID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerID="c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d" exitCode=143 Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 
22:27:56.662869 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-765746756b-b9w2l" event={"ID":"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4","Type":"ContainerDied","Data":"c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d"} Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.665304 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.665342 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-jrb5f" event={"ID":"1761df38-e690-4267-bc27-35ee08e90130","Type":"ContainerDied","Data":"8dbbe8efb506566257b9fa3f9a249746247083b366914afd9190ac56e9eb534a"} Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.665398 4830 scope.go:117] "RemoveContainer" containerID="b13bee9e39f1a04463e09a086e7208e4fd3c41e28c44587fc8464d629ac2ade7" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.669356 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"670e6335-d34c-46e4-8b4d-89dbd65c35a7","Type":"ContainerStarted","Data":"c91faf599be22e2bef0d06e657cd69cb41cb9620e7d17f7641b05f3126fedd1f"} Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.669912 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.674389 4830 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="cdbb7cb0-5934-40ce-a8b2-df7eef10df54" podUID="670e6335-d34c-46e4-8b4d-89dbd65c35a7" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.697223 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.710891 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jrb5f"] Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.714350 4830 scope.go:117] "RemoveContainer" containerID="f805477801e379cd71323c0aac7d81c3f814d15f1dd478512e4d1db7783f4de9" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.723497 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-jrb5f"] Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.851257 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cftdb\" (UniqueName: \"kubernetes.io/projected/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-kube-api-access-cftdb\") pod \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.851342 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config\") pod \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.851536 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config-secret\") pod \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\" (UID: \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.851564 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-combined-ca-bundle\") pod \"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\" (UID: 
\"cdbb7cb0-5934-40ce-a8b2-df7eef10df54\") " Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.852009 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "cdbb7cb0-5934-40ce-a8b2-df7eef10df54" (UID: "cdbb7cb0-5934-40ce-a8b2-df7eef10df54"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.852370 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.855809 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "cdbb7cb0-5934-40ce-a8b2-df7eef10df54" (UID: "cdbb7cb0-5934-40ce-a8b2-df7eef10df54"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.862781 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdbb7cb0-5934-40ce-a8b2-df7eef10df54" (UID: "cdbb7cb0-5934-40ce-a8b2-df7eef10df54"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.862803 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-kube-api-access-cftdb" (OuterVolumeSpecName: "kube-api-access-cftdb") pod "cdbb7cb0-5934-40ce-a8b2-df7eef10df54" (UID: "cdbb7cb0-5934-40ce-a8b2-df7eef10df54"). InnerVolumeSpecName "kube-api-access-cftdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.948812 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.954380 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cftdb\" (UniqueName: \"kubernetes.io/projected/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-kube-api-access-cftdb\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.954439 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:56 crc kubenswrapper[4830]: I1203 22:27:56.954453 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbb7cb0-5934-40ce-a8b2-df7eef10df54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:57 crc kubenswrapper[4830]: I1203 22:27:57.355191 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1761df38-e690-4267-bc27-35ee08e90130" path="/var/lib/kubelet/pods/1761df38-e690-4267-bc27-35ee08e90130/volumes" Dec 03 22:27:57 crc kubenswrapper[4830]: I1203 22:27:57.356361 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdbb7cb0-5934-40ce-a8b2-df7eef10df54" 
path="/var/lib/kubelet/pods/cdbb7cb0-5934-40ce-a8b2-df7eef10df54/volumes" Dec 03 22:27:57 crc kubenswrapper[4830]: I1203 22:27:57.683848 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 22:27:57 crc kubenswrapper[4830]: I1203 22:27:57.691440 4830 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="cdbb7cb0-5934-40ce-a8b2-df7eef10df54" podUID="670e6335-d34c-46e4-8b4d-89dbd65c35a7" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.896636 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-85b449dfbc-dfzlc"] Dec 03 22:27:58 crc kubenswrapper[4830]: E1203 22:27:58.897024 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1761df38-e690-4267-bc27-35ee08e90130" containerName="dnsmasq-dns" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.897035 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1761df38-e690-4267-bc27-35ee08e90130" containerName="dnsmasq-dns" Dec 03 22:27:58 crc kubenswrapper[4830]: E1203 22:27:58.897061 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1761df38-e690-4267-bc27-35ee08e90130" containerName="init" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.897067 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1761df38-e690-4267-bc27-35ee08e90130" containerName="init" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.897254 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1761df38-e690-4267-bc27-35ee08e90130" containerName="dnsmasq-dns" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.898329 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.900991 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.901174 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.901355 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.927810 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85b449dfbc-dfzlc"] Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.991770 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-combined-ca-bundle\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.992108 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614c1318-6703-47ae-89e5-e9b2dd9758e3-run-httpd\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.992142 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-internal-tls-certs\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:58 crc 
kubenswrapper[4830]: I1203 22:27:58.992196 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-config-data\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.992288 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-public-tls-certs\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.992322 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/614c1318-6703-47ae-89e5-e9b2dd9758e3-etc-swift\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.992368 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shqvp\" (UniqueName: \"kubernetes.io/projected/614c1318-6703-47ae-89e5-e9b2dd9758e3-kube-api-access-shqvp\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:58 crc kubenswrapper[4830]: I1203 22:27:58.992394 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614c1318-6703-47ae-89e5-e9b2dd9758e3-log-httpd\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " 
pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.093800 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-combined-ca-bundle\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.093852 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614c1318-6703-47ae-89e5-e9b2dd9758e3-run-httpd\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.093872 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-internal-tls-certs\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.093908 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-config-data\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.093967 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-public-tls-certs\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 
crc kubenswrapper[4830]: I1203 22:27:59.093991 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/614c1318-6703-47ae-89e5-e9b2dd9758e3-etc-swift\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.094020 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shqvp\" (UniqueName: \"kubernetes.io/projected/614c1318-6703-47ae-89e5-e9b2dd9758e3-kube-api-access-shqvp\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.094037 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614c1318-6703-47ae-89e5-e9b2dd9758e3-log-httpd\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.094932 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614c1318-6703-47ae-89e5-e9b2dd9758e3-run-httpd\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.094987 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614c1318-6703-47ae-89e5-e9b2dd9758e3-log-httpd\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.101381 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-internal-tls-certs\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.104591 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-combined-ca-bundle\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.105390 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-public-tls-certs\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.106590 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/614c1318-6703-47ae-89e5-e9b2dd9758e3-etc-swift\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.101483 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614c1318-6703-47ae-89e5-e9b2dd9758e3-config-data\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.127981 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shqvp\" (UniqueName: 
\"kubernetes.io/projected/614c1318-6703-47ae-89e5-e9b2dd9758e3-kube-api-access-shqvp\") pod \"swift-proxy-85b449dfbc-dfzlc\" (UID: \"614c1318-6703-47ae-89e5-e9b2dd9758e3\") " pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.220665 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.429939 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.481770 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.482131 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="ceilometer-central-agent" containerID="cri-o://a5ec9370cccd68d3c38d3c9ea4404a08bf0587d4c2493dbbc5b9151554d09a82" gracePeriod=30 Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.482577 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="proxy-httpd" containerID="cri-o://84ab53eace32d0a8cf5f5fdfa3f4b165d7d59972ed98a35cb8a74c43902ac1cf" gracePeriod=30 Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.482757 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="ceilometer-notification-agent" containerID="cri-o://29dbdb75077dffbc723cbb540ba7c93101d6335227086da0f41bee27a7835d1f" gracePeriod=30 Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.482809 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="sg-core" containerID="cri-o://5d352eddd42dae97311082ded3d862aa5ad5dc8ad15be9edb96e98677b5e62bf" gracePeriod=30 Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.500696 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.514096 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data-custom\") pod \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.514223 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-logs\") pod \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.514380 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-combined-ca-bundle\") pod \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.514413 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data\") pod \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.514432 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-r8sr9\" (UniqueName: \"kubernetes.io/projected/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-kube-api-access-r8sr9\") pod \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\" (UID: \"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4\") " Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.516641 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-logs" (OuterVolumeSpecName: "logs") pod "e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" (UID: "e3b7afb6-94e8-46aa-9bb4-3d2664f845a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.525213 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" (UID: "e3b7afb6-94e8-46aa-9bb4-3d2664f845a4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.533962 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-kube-api-access-r8sr9" (OuterVolumeSpecName: "kube-api-access-r8sr9") pod "e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" (UID: "e3b7afb6-94e8-46aa-9bb4-3d2664f845a4"). InnerVolumeSpecName "kube-api-access-r8sr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.578462 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" (UID: "e3b7afb6-94e8-46aa-9bb4-3d2664f845a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.607716 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data" (OuterVolumeSpecName: "config-data") pod "e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" (UID: "e3b7afb6-94e8-46aa-9bb4-3d2664f845a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.619421 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.619448 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.619461 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8sr9\" (UniqueName: \"kubernetes.io/projected/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-kube-api-access-r8sr9\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.619470 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.619480 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.720756 4830 generic.go:334] "Generic (PLEG): container finished" podID="1d44c909-f792-4543-8f3e-a168e708be4f" 
containerID="5d352eddd42dae97311082ded3d862aa5ad5dc8ad15be9edb96e98677b5e62bf" exitCode=2 Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.720838 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d44c909-f792-4543-8f3e-a168e708be4f","Type":"ContainerDied","Data":"5d352eddd42dae97311082ded3d862aa5ad5dc8ad15be9edb96e98677b5e62bf"} Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.723518 4830 generic.go:334] "Generic (PLEG): container finished" podID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerID="800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c" exitCode=0 Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.723562 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-765746756b-b9w2l" event={"ID":"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4","Type":"ContainerDied","Data":"800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c"} Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.723589 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-765746756b-b9w2l" event={"ID":"e3b7afb6-94e8-46aa-9bb4-3d2664f845a4","Type":"ContainerDied","Data":"979923b565cf271822f3c1fb2af250c661d73d02b441f0d5917add4f9eac457b"} Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.723591 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-765746756b-b9w2l" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.723606 4830 scope.go:117] "RemoveContainer" containerID="800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.763641 4830 scope.go:117] "RemoveContainer" containerID="c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.767996 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-765746756b-b9w2l"] Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.780471 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-765746756b-b9w2l"] Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.809214 4830 scope.go:117] "RemoveContainer" containerID="800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c" Dec 03 22:27:59 crc kubenswrapper[4830]: E1203 22:27:59.813650 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c\": container with ID starting with 800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c not found: ID does not exist" containerID="800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.813688 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c"} err="failed to get container status \"800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c\": rpc error: code = NotFound desc = could not find container \"800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c\": container with ID starting with 800aa85d8e61b4465db38ce83c2de04771b1f879ae8c46dd7d69959115e9645c not found: ID does not exist" Dec 
03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.813717 4830 scope.go:117] "RemoveContainer" containerID="c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d" Dec 03 22:27:59 crc kubenswrapper[4830]: E1203 22:27:59.815223 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d\": container with ID starting with c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d not found: ID does not exist" containerID="c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.815258 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d"} err="failed to get container status \"c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d\": rpc error: code = NotFound desc = could not find container \"c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d\": container with ID starting with c406d56b2586b37c3bfb8d5d61d064f81710d894fc4e06f970818e82c0fc9e4d not found: ID does not exist" Dec 03 22:27:59 crc kubenswrapper[4830]: I1203 22:27:59.917306 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85b449dfbc-dfzlc"] Dec 03 22:27:59 crc kubenswrapper[4830]: W1203 22:27:59.926432 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod614c1318_6703_47ae_89e5_e9b2dd9758e3.slice/crio-240169cab2acd3d6ba094a369cf0578ce8842f7a749b6dacd06ac6c4e90967e3 WatchSource:0}: Error finding container 240169cab2acd3d6ba094a369cf0578ce8842f7a749b6dacd06ac6c4e90967e3: Status 404 returned error can't find the container with id 240169cab2acd3d6ba094a369cf0578ce8842f7a749b6dacd06ac6c4e90967e3 Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 
22:28:00.125773 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.126466 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerName="glance-log" containerID="cri-o://b9111f05eccde3e8cde1eb9e6e9e0610af3ee94be99bcabe9f8d6bd149b26ed6" gracePeriod=30 Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.126646 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerName="glance-httpd" containerID="cri-o://3e7c58e46ceaf242b90a388323d07ea8508b4538a702c15fdeb279488542a7e7" gracePeriod=30 Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.186803 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.736267 4830 generic.go:334] "Generic (PLEG): container finished" podID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerID="b9111f05eccde3e8cde1eb9e6e9e0610af3ee94be99bcabe9f8d6bd149b26ed6" exitCode=143 Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.736363 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fa8d18ad-39c6-4264-9a3d-cab20b1ea138","Type":"ContainerDied","Data":"b9111f05eccde3e8cde1eb9e6e9e0610af3ee94be99bcabe9f8d6bd149b26ed6"} Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.738416 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85b449dfbc-dfzlc" event={"ID":"614c1318-6703-47ae-89e5-e9b2dd9758e3","Type":"ContainerStarted","Data":"66c41cff1d1f3431f961e7223c8197f4d59bec964e933cb4040090d3c3901274"} Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.738457 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-proxy-85b449dfbc-dfzlc" event={"ID":"614c1318-6703-47ae-89e5-e9b2dd9758e3","Type":"ContainerStarted","Data":"dfc7217c3d38e94d98b1f5da5a9412f19880d66cbc4483096ca1568e2ab1597a"} Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.738472 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85b449dfbc-dfzlc" event={"ID":"614c1318-6703-47ae-89e5-e9b2dd9758e3","Type":"ContainerStarted","Data":"240169cab2acd3d6ba094a369cf0578ce8842f7a749b6dacd06ac6c4e90967e3"} Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.738548 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.740754 4830 generic.go:334] "Generic (PLEG): container finished" podID="1d44c909-f792-4543-8f3e-a168e708be4f" containerID="84ab53eace32d0a8cf5f5fdfa3f4b165d7d59972ed98a35cb8a74c43902ac1cf" exitCode=0 Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.740780 4830 generic.go:334] "Generic (PLEG): container finished" podID="1d44c909-f792-4543-8f3e-a168e708be4f" containerID="a5ec9370cccd68d3c38d3c9ea4404a08bf0587d4c2493dbbc5b9151554d09a82" exitCode=0 Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.740820 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d44c909-f792-4543-8f3e-a168e708be4f","Type":"ContainerDied","Data":"84ab53eace32d0a8cf5f5fdfa3f4b165d7d59972ed98a35cb8a74c43902ac1cf"} Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.740843 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d44c909-f792-4543-8f3e-a168e708be4f","Type":"ContainerDied","Data":"a5ec9370cccd68d3c38d3c9ea4404a08bf0587d4c2493dbbc5b9151554d09a82"} Dec 03 22:28:00 crc kubenswrapper[4830]: I1203 22:28:00.760856 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-85b449dfbc-dfzlc" 
podStartSLOduration=2.760837579 podStartE2EDuration="2.760837579s" podCreationTimestamp="2025-12-03 22:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:00.757746035 +0000 UTC m=+1369.754207404" watchObservedRunningTime="2025-12-03 22:28:00.760837579 +0000 UTC m=+1369.757298928" Dec 03 22:28:01 crc kubenswrapper[4830]: I1203 22:28:01.356643 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" path="/var/lib/kubelet/pods/e3b7afb6-94e8-46aa-9bb4-3d2664f845a4/volumes" Dec 03 22:28:01 crc kubenswrapper[4830]: I1203 22:28:01.786046 4830 generic.go:334] "Generic (PLEG): container finished" podID="1d44c909-f792-4543-8f3e-a168e708be4f" containerID="29dbdb75077dffbc723cbb540ba7c93101d6335227086da0f41bee27a7835d1f" exitCode=0 Dec 03 22:28:01 crc kubenswrapper[4830]: I1203 22:28:01.786643 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d44c909-f792-4543-8f3e-a168e708be4f","Type":"ContainerDied","Data":"29dbdb75077dffbc723cbb540ba7c93101d6335227086da0f41bee27a7835d1f"} Dec 03 22:28:01 crc kubenswrapper[4830]: I1203 22:28:01.786787 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:28:03 crc kubenswrapper[4830]: I1203 22:28:03.790066 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 22:28:03 crc kubenswrapper[4830]: I1203 22:28:03.792463 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="10e75b14-a94b-4936-bd47-9da029e04272" containerName="glance-log" containerID="cri-o://9e85cb9626d98c593715b25c00bbb3fd667dd9c4f6ae7a01c800d755aae0bed6" gracePeriod=30 Dec 03 22:28:03 crc kubenswrapper[4830]: I1203 22:28:03.792952 4830 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-external-api-0" podUID="10e75b14-a94b-4936-bd47-9da029e04272" containerName="glance-httpd" containerID="cri-o://96a1c81884ef4bde5cf0b9533ed9dc7f960a61f9d7511b2c807f4859e8ef48c7" gracePeriod=30 Dec 03 22:28:03 crc kubenswrapper[4830]: I1203 22:28:03.812863 4830 generic.go:334] "Generic (PLEG): container finished" podID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerID="3e7c58e46ceaf242b90a388323d07ea8508b4538a702c15fdeb279488542a7e7" exitCode=0 Dec 03 22:28:03 crc kubenswrapper[4830]: I1203 22:28:03.812909 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fa8d18ad-39c6-4264-9a3d-cab20b1ea138","Type":"ContainerDied","Data":"3e7c58e46ceaf242b90a388323d07ea8508b4538a702c15fdeb279488542a7e7"} Dec 03 22:28:04 crc kubenswrapper[4830]: I1203 22:28:04.233036 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-765746756b-b9w2l" podUID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:04 crc kubenswrapper[4830]: I1203 22:28:04.233130 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-765746756b-b9w2l" podUID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:04 crc kubenswrapper[4830]: I1203 22:28:04.594377 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.167:9292/healthcheck\": dial tcp 10.217.0.167:9292: connect: connection refused" Dec 03 
22:28:04 crc kubenswrapper[4830]: I1203 22:28:04.594434 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.167:9292/healthcheck\": dial tcp 10.217.0.167:9292: connect: connection refused" Dec 03 22:28:04 crc kubenswrapper[4830]: I1203 22:28:04.832418 4830 generic.go:334] "Generic (PLEG): container finished" podID="10e75b14-a94b-4936-bd47-9da029e04272" containerID="9e85cb9626d98c593715b25c00bbb3fd667dd9c4f6ae7a01c800d755aae0bed6" exitCode=143 Dec 03 22:28:04 crc kubenswrapper[4830]: I1203 22:28:04.832551 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"10e75b14-a94b-4936-bd47-9da029e04272","Type":"ContainerDied","Data":"9e85cb9626d98c593715b25c00bbb3fd667dd9c4f6ae7a01c800d755aae0bed6"} Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.709403 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rkz52"] Dec 03 22:28:05 crc kubenswrapper[4830]: E1203 22:28:05.709781 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerName="barbican-api" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.709797 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerName="barbican-api" Dec 03 22:28:05 crc kubenswrapper[4830]: E1203 22:28:05.709809 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerName="barbican-api-log" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.709816 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerName="barbican-api-log" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.710028 4830 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerName="barbican-api-log" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.710053 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b7afb6-94e8-46aa-9bb4-3d2664f845a4" containerName="barbican-api" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.710690 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rkz52" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.721065 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rkz52"] Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.763702 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fgjk\" (UniqueName: \"kubernetes.io/projected/67505812-b68d-4b56-b353-749d2f12dcdd-kube-api-access-2fgjk\") pod \"nova-api-db-create-rkz52\" (UID: \"67505812-b68d-4b56-b353-749d2f12dcdd\") " pod="openstack/nova-api-db-create-rkz52" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.763954 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67505812-b68d-4b56-b353-749d2f12dcdd-operator-scripts\") pod \"nova-api-db-create-rkz52\" (UID: \"67505812-b68d-4b56-b353-749d2f12dcdd\") " pod="openstack/nova-api-db-create-rkz52" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.799132 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-gr4gs"] Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.800410 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gr4gs" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.810469 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-77a6-account-create-update-4jwrz"] Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.812156 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-77a6-account-create-update-4jwrz" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.813999 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.825044 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gr4gs"] Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.834568 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-77a6-account-create-update-4jwrz"] Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.865838 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjvbr\" (UniqueName: \"kubernetes.io/projected/ab83afa3-0096-4c4d-804e-656d1c5de542-kube-api-access-hjvbr\") pod \"nova-cell0-db-create-gr4gs\" (UID: \"ab83afa3-0096-4c4d-804e-656d1c5de542\") " pod="openstack/nova-cell0-db-create-gr4gs" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.865906 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fgjk\" (UniqueName: \"kubernetes.io/projected/67505812-b68d-4b56-b353-749d2f12dcdd-kube-api-access-2fgjk\") pod \"nova-api-db-create-rkz52\" (UID: \"67505812-b68d-4b56-b353-749d2f12dcdd\") " pod="openstack/nova-api-db-create-rkz52" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.865937 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/67505812-b68d-4b56-b353-749d2f12dcdd-operator-scripts\") pod \"nova-api-db-create-rkz52\" (UID: \"67505812-b68d-4b56-b353-749d2f12dcdd\") " pod="openstack/nova-api-db-create-rkz52" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.865976 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgw5h\" (UniqueName: \"kubernetes.io/projected/630ed790-c3e9-471e-96a4-8f3c63aa837e-kube-api-access-mgw5h\") pod \"nova-api-77a6-account-create-update-4jwrz\" (UID: \"630ed790-c3e9-471e-96a4-8f3c63aa837e\") " pod="openstack/nova-api-77a6-account-create-update-4jwrz" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.866026 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab83afa3-0096-4c4d-804e-656d1c5de542-operator-scripts\") pod \"nova-cell0-db-create-gr4gs\" (UID: \"ab83afa3-0096-4c4d-804e-656d1c5de542\") " pod="openstack/nova-cell0-db-create-gr4gs" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.866043 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630ed790-c3e9-471e-96a4-8f3c63aa837e-operator-scripts\") pod \"nova-api-77a6-account-create-update-4jwrz\" (UID: \"630ed790-c3e9-471e-96a4-8f3c63aa837e\") " pod="openstack/nova-api-77a6-account-create-update-4jwrz" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.866889 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67505812-b68d-4b56-b353-749d2f12dcdd-operator-scripts\") pod \"nova-api-db-create-rkz52\" (UID: \"67505812-b68d-4b56-b353-749d2f12dcdd\") " pod="openstack/nova-api-db-create-rkz52" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.904387 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2fgjk\" (UniqueName: \"kubernetes.io/projected/67505812-b68d-4b56-b353-749d2f12dcdd-kube-api-access-2fgjk\") pod \"nova-api-db-create-rkz52\" (UID: \"67505812-b68d-4b56-b353-749d2f12dcdd\") " pod="openstack/nova-api-db-create-rkz52" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.913608 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-d6c5m"] Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.915033 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d6c5m" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.922212 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d6c5m"] Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.967687 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05a32af7-077b-4320-bdc0-a208b4132154-operator-scripts\") pod \"nova-cell1-db-create-d6c5m\" (UID: \"05a32af7-077b-4320-bdc0-a208b4132154\") " pod="openstack/nova-cell1-db-create-d6c5m" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.967787 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48xh9\" (UniqueName: \"kubernetes.io/projected/05a32af7-077b-4320-bdc0-a208b4132154-kube-api-access-48xh9\") pod \"nova-cell1-db-create-d6c5m\" (UID: \"05a32af7-077b-4320-bdc0-a208b4132154\") " pod="openstack/nova-cell1-db-create-d6c5m" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.967822 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgw5h\" (UniqueName: \"kubernetes.io/projected/630ed790-c3e9-471e-96a4-8f3c63aa837e-kube-api-access-mgw5h\") pod \"nova-api-77a6-account-create-update-4jwrz\" (UID: \"630ed790-c3e9-471e-96a4-8f3c63aa837e\") " 
pod="openstack/nova-api-77a6-account-create-update-4jwrz" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.967883 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab83afa3-0096-4c4d-804e-656d1c5de542-operator-scripts\") pod \"nova-cell0-db-create-gr4gs\" (UID: \"ab83afa3-0096-4c4d-804e-656d1c5de542\") " pod="openstack/nova-cell0-db-create-gr4gs" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.967903 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630ed790-c3e9-471e-96a4-8f3c63aa837e-operator-scripts\") pod \"nova-api-77a6-account-create-update-4jwrz\" (UID: \"630ed790-c3e9-471e-96a4-8f3c63aa837e\") " pod="openstack/nova-api-77a6-account-create-update-4jwrz" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.967997 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjvbr\" (UniqueName: \"kubernetes.io/projected/ab83afa3-0096-4c4d-804e-656d1c5de542-kube-api-access-hjvbr\") pod \"nova-cell0-db-create-gr4gs\" (UID: \"ab83afa3-0096-4c4d-804e-656d1c5de542\") " pod="openstack/nova-cell0-db-create-gr4gs" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.968965 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab83afa3-0096-4c4d-804e-656d1c5de542-operator-scripts\") pod \"nova-cell0-db-create-gr4gs\" (UID: \"ab83afa3-0096-4c4d-804e-656d1c5de542\") " pod="openstack/nova-cell0-db-create-gr4gs" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.970976 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630ed790-c3e9-471e-96a4-8f3c63aa837e-operator-scripts\") pod \"nova-api-77a6-account-create-update-4jwrz\" (UID: \"630ed790-c3e9-471e-96a4-8f3c63aa837e\") " 
pod="openstack/nova-api-77a6-account-create-update-4jwrz" Dec 03 22:28:05 crc kubenswrapper[4830]: I1203 22:28:05.994614 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjvbr\" (UniqueName: \"kubernetes.io/projected/ab83afa3-0096-4c4d-804e-656d1c5de542-kube-api-access-hjvbr\") pod \"nova-cell0-db-create-gr4gs\" (UID: \"ab83afa3-0096-4c4d-804e-656d1c5de542\") " pod="openstack/nova-cell0-db-create-gr4gs" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.005250 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgw5h\" (UniqueName: \"kubernetes.io/projected/630ed790-c3e9-471e-96a4-8f3c63aa837e-kube-api-access-mgw5h\") pod \"nova-api-77a6-account-create-update-4jwrz\" (UID: \"630ed790-c3e9-471e-96a4-8f3c63aa837e\") " pod="openstack/nova-api-77a6-account-create-update-4jwrz" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.024476 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-616a-account-create-update-rvrn5"] Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.025835 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-616a-account-create-update-rvrn5" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.028705 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.029091 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rkz52" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.043102 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-616a-account-create-update-rvrn5"] Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.070273 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf9jd\" (UniqueName: \"kubernetes.io/projected/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-kube-api-access-gf9jd\") pod \"nova-cell0-616a-account-create-update-rvrn5\" (UID: \"42d8b7eb-e5b5-4445-9b9d-2a0472500fea\") " pod="openstack/nova-cell0-616a-account-create-update-rvrn5" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.070344 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05a32af7-077b-4320-bdc0-a208b4132154-operator-scripts\") pod \"nova-cell1-db-create-d6c5m\" (UID: \"05a32af7-077b-4320-bdc0-a208b4132154\") " pod="openstack/nova-cell1-db-create-d6c5m" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.070375 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-operator-scripts\") pod \"nova-cell0-616a-account-create-update-rvrn5\" (UID: \"42d8b7eb-e5b5-4445-9b9d-2a0472500fea\") " pod="openstack/nova-cell0-616a-account-create-update-rvrn5" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.070538 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48xh9\" (UniqueName: \"kubernetes.io/projected/05a32af7-077b-4320-bdc0-a208b4132154-kube-api-access-48xh9\") pod \"nova-cell1-db-create-d6c5m\" (UID: \"05a32af7-077b-4320-bdc0-a208b4132154\") " pod="openstack/nova-cell1-db-create-d6c5m" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.071183 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05a32af7-077b-4320-bdc0-a208b4132154-operator-scripts\") pod \"nova-cell1-db-create-d6c5m\" (UID: \"05a32af7-077b-4320-bdc0-a208b4132154\") " pod="openstack/nova-cell1-db-create-d6c5m" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.091350 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48xh9\" (UniqueName: \"kubernetes.io/projected/05a32af7-077b-4320-bdc0-a208b4132154-kube-api-access-48xh9\") pod \"nova-cell1-db-create-d6c5m\" (UID: \"05a32af7-077b-4320-bdc0-a208b4132154\") " pod="openstack/nova-cell1-db-create-d6c5m" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.132044 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gr4gs" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.143924 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-77a6-account-create-update-4jwrz" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.172453 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf9jd\" (UniqueName: \"kubernetes.io/projected/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-kube-api-access-gf9jd\") pod \"nova-cell0-616a-account-create-update-rvrn5\" (UID: \"42d8b7eb-e5b5-4445-9b9d-2a0472500fea\") " pod="openstack/nova-cell0-616a-account-create-update-rvrn5" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.172558 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-operator-scripts\") pod \"nova-cell0-616a-account-create-update-rvrn5\" (UID: \"42d8b7eb-e5b5-4445-9b9d-2a0472500fea\") " pod="openstack/nova-cell0-616a-account-create-update-rvrn5" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 
22:28:06.173743 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-operator-scripts\") pod \"nova-cell0-616a-account-create-update-rvrn5\" (UID: \"42d8b7eb-e5b5-4445-9b9d-2a0472500fea\") " pod="openstack/nova-cell0-616a-account-create-update-rvrn5" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.193073 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf9jd\" (UniqueName: \"kubernetes.io/projected/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-kube-api-access-gf9jd\") pod \"nova-cell0-616a-account-create-update-rvrn5\" (UID: \"42d8b7eb-e5b5-4445-9b9d-2a0472500fea\") " pod="openstack/nova-cell0-616a-account-create-update-rvrn5" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.216608 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-fb1a-account-create-update-mv227"] Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.238820 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fb1a-account-create-update-mv227"] Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.238939 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb1a-account-create-update-mv227" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.240555 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.271352 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-d6c5m" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.276474 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37ff28df-2c9e-4694-a75d-9b7dd825c575-operator-scripts\") pod \"nova-cell1-fb1a-account-create-update-mv227\" (UID: \"37ff28df-2c9e-4694-a75d-9b7dd825c575\") " pod="openstack/nova-cell1-fb1a-account-create-update-mv227" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.277182 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnpbj\" (UniqueName: \"kubernetes.io/projected/37ff28df-2c9e-4694-a75d-9b7dd825c575-kube-api-access-qnpbj\") pod \"nova-cell1-fb1a-account-create-update-mv227\" (UID: \"37ff28df-2c9e-4694-a75d-9b7dd825c575\") " pod="openstack/nova-cell1-fb1a-account-create-update-mv227" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.380390 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37ff28df-2c9e-4694-a75d-9b7dd825c575-operator-scripts\") pod \"nova-cell1-fb1a-account-create-update-mv227\" (UID: \"37ff28df-2c9e-4694-a75d-9b7dd825c575\") " pod="openstack/nova-cell1-fb1a-account-create-update-mv227" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.380527 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnpbj\" (UniqueName: \"kubernetes.io/projected/37ff28df-2c9e-4694-a75d-9b7dd825c575-kube-api-access-qnpbj\") pod \"nova-cell1-fb1a-account-create-update-mv227\" (UID: \"37ff28df-2c9e-4694-a75d-9b7dd825c575\") " pod="openstack/nova-cell1-fb1a-account-create-update-mv227" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.381399 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/37ff28df-2c9e-4694-a75d-9b7dd825c575-operator-scripts\") pod \"nova-cell1-fb1a-account-create-update-mv227\" (UID: \"37ff28df-2c9e-4694-a75d-9b7dd825c575\") " pod="openstack/nova-cell1-fb1a-account-create-update-mv227" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.399493 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnpbj\" (UniqueName: \"kubernetes.io/projected/37ff28df-2c9e-4694-a75d-9b7dd825c575-kube-api-access-qnpbj\") pod \"nova-cell1-fb1a-account-create-update-mv227\" (UID: \"37ff28df-2c9e-4694-a75d-9b7dd825c575\") " pod="openstack/nova-cell1-fb1a-account-create-update-mv227" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.440940 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-616a-account-create-update-rvrn5" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.560543 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb1a-account-create-update-mv227" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.940053 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="10e75b14-a94b-4936-bd47-9da029e04272" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.168:9292/healthcheck\": read tcp 10.217.0.2:38918->10.217.0.168:9292: read: connection reset by peer" Dec 03 22:28:06 crc kubenswrapper[4830]: I1203 22:28:06.940359 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="10e75b14-a94b-4936-bd47-9da029e04272" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.168:9292/healthcheck\": read tcp 10.217.0.2:38920->10.217.0.168:9292: read: connection reset by peer" Dec 03 22:28:07 crc kubenswrapper[4830]: I1203 22:28:07.880391 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="10e75b14-a94b-4936-bd47-9da029e04272" containerID="96a1c81884ef4bde5cf0b9533ed9dc7f960a61f9d7511b2c807f4859e8ef48c7" exitCode=0 Dec 03 22:28:07 crc kubenswrapper[4830]: I1203 22:28:07.880715 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"10e75b14-a94b-4936-bd47-9da029e04272","Type":"ContainerDied","Data":"96a1c81884ef4bde5cf0b9533ed9dc7f960a61f9d7511b2c807f4859e8ef48c7"} Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.547853 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.606489 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.639052 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-run-httpd\") pod \"1d44c909-f792-4543-8f3e-a168e708be4f\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.639096 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-combined-ca-bundle\") pod \"1d44c909-f792-4543-8f3e-a168e708be4f\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.639122 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-config-data\") pod \"1d44c909-f792-4543-8f3e-a168e708be4f\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.639221 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-log-httpd\") pod \"1d44c909-f792-4543-8f3e-a168e708be4f\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.639303 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv2bw\" (UniqueName: \"kubernetes.io/projected/1d44c909-f792-4543-8f3e-a168e708be4f-kube-api-access-dv2bw\") pod \"1d44c909-f792-4543-8f3e-a168e708be4f\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.639377 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-sg-core-conf-yaml\") pod \"1d44c909-f792-4543-8f3e-a168e708be4f\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.639467 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-scripts\") pod \"1d44c909-f792-4543-8f3e-a168e708be4f\" (UID: \"1d44c909-f792-4543-8f3e-a168e708be4f\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.658686 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-scripts" (OuterVolumeSpecName: "scripts") pod "1d44c909-f792-4543-8f3e-a168e708be4f" (UID: "1d44c909-f792-4543-8f3e-a168e708be4f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.658971 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1d44c909-f792-4543-8f3e-a168e708be4f" (UID: "1d44c909-f792-4543-8f3e-a168e708be4f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.660439 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1d44c909-f792-4543-8f3e-a168e708be4f" (UID: "1d44c909-f792-4543-8f3e-a168e708be4f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.665477 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d44c909-f792-4543-8f3e-a168e708be4f-kube-api-access-dv2bw" (OuterVolumeSpecName: "kube-api-access-dv2bw") pod "1d44c909-f792-4543-8f3e-a168e708be4f" (UID: "1d44c909-f792-4543-8f3e-a168e708be4f"). InnerVolumeSpecName "kube-api-access-dv2bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.700460 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1d44c909-f792-4543-8f3e-a168e708be4f" (UID: "1d44c909-f792-4543-8f3e-a168e708be4f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.743962 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-scripts\") pod \"10e75b14-a94b-4936-bd47-9da029e04272\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.744430 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t7nf\" (UniqueName: \"kubernetes.io/projected/10e75b14-a94b-4936-bd47-9da029e04272-kube-api-access-8t7nf\") pod \"10e75b14-a94b-4936-bd47-9da029e04272\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.744661 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"10e75b14-a94b-4936-bd47-9da029e04272\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.744713 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-combined-ca-bundle\") pod \"10e75b14-a94b-4936-bd47-9da029e04272\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.744750 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-logs\") pod \"10e75b14-a94b-4936-bd47-9da029e04272\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.744781 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-httpd-run\") pod \"10e75b14-a94b-4936-bd47-9da029e04272\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.744908 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-config-data\") pod \"10e75b14-a94b-4936-bd47-9da029e04272\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.744992 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-public-tls-certs\") pod \"10e75b14-a94b-4936-bd47-9da029e04272\" (UID: \"10e75b14-a94b-4936-bd47-9da029e04272\") " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.745847 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.745872 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.745884 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d44c909-f792-4543-8f3e-a168e708be4f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.745895 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv2bw\" (UniqueName: \"kubernetes.io/projected/1d44c909-f792-4543-8f3e-a168e708be4f-kube-api-access-dv2bw\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.745906 4830 
reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.746282 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-logs" (OuterVolumeSpecName: "logs") pod "10e75b14-a94b-4936-bd47-9da029e04272" (UID: "10e75b14-a94b-4936-bd47-9da029e04272"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.748216 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e75b14-a94b-4936-bd47-9da029e04272-kube-api-access-8t7nf" (OuterVolumeSpecName: "kube-api-access-8t7nf") pod "10e75b14-a94b-4936-bd47-9da029e04272" (UID: "10e75b14-a94b-4936-bd47-9da029e04272"). InnerVolumeSpecName "kube-api-access-8t7nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.749173 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "10e75b14-a94b-4936-bd47-9da029e04272" (UID: "10e75b14-a94b-4936-bd47-9da029e04272"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.756534 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-scripts" (OuterVolumeSpecName: "scripts") pod "10e75b14-a94b-4936-bd47-9da029e04272" (UID: "10e75b14-a94b-4936-bd47-9da029e04272"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.777191 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d" (OuterVolumeSpecName: "glance") pod "10e75b14-a94b-4936-bd47-9da029e04272" (UID: "10e75b14-a94b-4936-bd47-9da029e04272"). InnerVolumeSpecName "pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.828219 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10e75b14-a94b-4936-bd47-9da029e04272" (UID: "10e75b14-a94b-4936-bd47-9da029e04272"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.828441 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-config-data" (OuterVolumeSpecName: "config-data") pod "1d44c909-f792-4543-8f3e-a168e708be4f" (UID: "1d44c909-f792-4543-8f3e-a168e708be4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.840908 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d44c909-f792-4543-8f3e-a168e708be4f" (UID: "1d44c909-f792-4543-8f3e-a168e708be4f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.845267 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "10e75b14-a94b-4936-bd47-9da029e04272" (UID: "10e75b14-a94b-4936-bd47-9da029e04272"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.847438 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.847468 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t7nf\" (UniqueName: \"kubernetes.io/projected/10e75b14-a94b-4936-bd47-9da029e04272-kube-api-access-8t7nf\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.847479 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.847488 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d44c909-f792-4543-8f3e-a168e708be4f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.847532 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") on node \"crc\" " Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.847544 4830 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.847555 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.847564 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10e75b14-a94b-4936-bd47-9da029e04272-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.847573 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.852365 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-config-data" (OuterVolumeSpecName: "config-data") pod "10e75b14-a94b-4936-bd47-9da029e04272" (UID: "10e75b14-a94b-4936-bd47-9da029e04272"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.872758 4830 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.872910 4830 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d") on node "crc" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.900335 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"670e6335-d34c-46e4-8b4d-89dbd65c35a7","Type":"ContainerStarted","Data":"39c691c899a3c70e78328ed31eef760975405a7523faf1fc291cb1a401d0488a"} Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.916259 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d44c909-f792-4543-8f3e-a168e708be4f","Type":"ContainerDied","Data":"6703fce322ea6b53dc5de42b41bffb13ab46863f5f942cd75b26b417bf522cb9"} Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.916328 4830 scope.go:117] "RemoveContainer" containerID="84ab53eace32d0a8cf5f5fdfa3f4b165d7d59972ed98a35cb8a74c43902ac1cf" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.916488 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.930169 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.176887959 podStartE2EDuration="13.930148424s" podCreationTimestamp="2025-12-03 22:27:55 +0000 UTC" firstStartedPulling="2025-12-03 22:27:56.555194402 +0000 UTC m=+1365.551655751" lastFinishedPulling="2025-12-03 22:28:08.308454867 +0000 UTC m=+1377.304916216" observedRunningTime="2025-12-03 22:28:08.923975436 +0000 UTC m=+1377.920436785" watchObservedRunningTime="2025-12-03 22:28:08.930148424 +0000 UTC m=+1377.926609783" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.930704 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"10e75b14-a94b-4936-bd47-9da029e04272","Type":"ContainerDied","Data":"ae233b75f6b4dccd99582c5b4d7b2a2ec695715ae60e3957ed9bef9c2ccbf382"} Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.930797 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.952023 4830 reconciler_common.go:293] "Volume detached for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.952237 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e75b14-a94b-4936-bd47-9da029e04272-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.981425 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d6c5m"] Dec 03 22:28:08 crc kubenswrapper[4830]: I1203 22:28:08.988923 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-77a6-account-create-update-4jwrz"] Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.137611 4830 scope.go:117] "RemoveContainer" containerID="5d352eddd42dae97311082ded3d862aa5ad5dc8ad15be9edb96e98677b5e62bf" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.227879 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.229391 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85b449dfbc-dfzlc" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.256496 4830 scope.go:117] "RemoveContainer" containerID="29dbdb75077dffbc723cbb540ba7c93101d6335227086da0f41bee27a7835d1f" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.256620 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.294727 4830 scope.go:117] "RemoveContainer" containerID="a5ec9370cccd68d3c38d3c9ea4404a08bf0587d4c2493dbbc5b9151554d09a82" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.330133 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.343382 4830 scope.go:117] "RemoveContainer" containerID="96a1c81884ef4bde5cf0b9533ed9dc7f960a61f9d7511b2c807f4859e8ef48c7" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.366043 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.366639 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-httpd-run\") pod \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.366692 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs5r6\" (UniqueName: \"kubernetes.io/projected/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-kube-api-access-bs5r6\") pod \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.366796 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.366872 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-combined-ca-bundle\") pod \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.366938 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-scripts\") pod \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.367005 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-config-data\") pod \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.367029 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-logs\") pod \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.367051 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-internal-tls-certs\") pod \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\" (UID: \"fa8d18ad-39c6-4264-9a3d-cab20b1ea138\") " Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.368366 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fa8d18ad-39c6-4264-9a3d-cab20b1ea138" (UID: "fa8d18ad-39c6-4264-9a3d-cab20b1ea138"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.368802 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-logs" (OuterVolumeSpecName: "logs") pod "fa8d18ad-39c6-4264-9a3d-cab20b1ea138" (UID: "fa8d18ad-39c6-4264-9a3d-cab20b1ea138"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.391820 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-kube-api-access-bs5r6" (OuterVolumeSpecName: "kube-api-access-bs5r6") pod "fa8d18ad-39c6-4264-9a3d-cab20b1ea138" (UID: "fa8d18ad-39c6-4264-9a3d-cab20b1ea138"). InnerVolumeSpecName "kube-api-access-bs5r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.395009 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-scripts" (OuterVolumeSpecName: "scripts") pod "fa8d18ad-39c6-4264-9a3d-cab20b1ea138" (UID: "fa8d18ad-39c6-4264-9a3d-cab20b1ea138"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.404167 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180" (OuterVolumeSpecName: "glance") pod "fa8d18ad-39c6-4264-9a3d-cab20b1ea138" (UID: "fa8d18ad-39c6-4264-9a3d-cab20b1ea138"). InnerVolumeSpecName "pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.421652 4830 scope.go:117] "RemoveContainer" containerID="9e85cb9626d98c593715b25c00bbb3fd667dd9c4f6ae7a01c800d755aae0bed6" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.453345 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa8d18ad-39c6-4264-9a3d-cab20b1ea138" (UID: "fa8d18ad-39c6-4264-9a3d-cab20b1ea138"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.455807 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 22:28:09 crc kubenswrapper[4830]: E1203 22:28:09.456431 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="ceilometer-central-agent" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.456499 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="ceilometer-central-agent" Dec 03 22:28:09 crc kubenswrapper[4830]: E1203 22:28:09.456574 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="proxy-httpd" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.456632 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="proxy-httpd" Dec 03 22:28:09 crc kubenswrapper[4830]: E1203 22:28:09.456703 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e75b14-a94b-4936-bd47-9da029e04272" containerName="glance-log" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.456883 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="10e75b14-a94b-4936-bd47-9da029e04272" containerName="glance-log" Dec 03 22:28:09 crc kubenswrapper[4830]: E1203 22:28:09.456960 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="sg-core" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.457011 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="sg-core" Dec 03 22:28:09 crc kubenswrapper[4830]: E1203 22:28:09.457116 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerName="glance-log" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.457171 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerName="glance-log" Dec 03 22:28:09 crc kubenswrapper[4830]: E1203 22:28:09.457231 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="ceilometer-notification-agent" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.457324 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="ceilometer-notification-agent" Dec 03 22:28:09 crc kubenswrapper[4830]: E1203 22:28:09.457383 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e75b14-a94b-4936-bd47-9da029e04272" containerName="glance-httpd" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.457437 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e75b14-a94b-4936-bd47-9da029e04272" containerName="glance-httpd" Dec 03 22:28:09 crc kubenswrapper[4830]: E1203 22:28:09.457531 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerName="glance-httpd" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.458013 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerName="glance-httpd" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.458434 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="sg-core" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.458551 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e75b14-a94b-4936-bd47-9da029e04272" containerName="glance-httpd" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.458650 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="ceilometer-notification-agent" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.458797 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e75b14-a94b-4936-bd47-9da029e04272" containerName="glance-log" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.458881 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerName="glance-log" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.458950 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="ceilometer-central-agent" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.459036 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" containerName="glance-httpd" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.459134 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="proxy-httpd" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.461335 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.467878 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.468143 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 22:28:09 crc kubenswrapper[4830]: W1203 22:28:09.468604 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab83afa3_0096_4c4d_804e_656d1c5de542.slice/crio-ac2a398323e075f26b7bd48db55dfa328dc5011744ec10d143d7447202a7b750 WatchSource:0}: Error finding container ac2a398323e075f26b7bd48db55dfa328dc5011744ec10d143d7447202a7b750: Status 404 returned error can't find the container with id ac2a398323e075f26b7bd48db55dfa328dc5011744ec10d143d7447202a7b750 Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.468858 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.469072 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.469374 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.469626 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs5r6\" (UniqueName: \"kubernetes.io/projected/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-kube-api-access-bs5r6\") on node \"crc\" DevicePath 
\"\"" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.469676 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") on node \"crc\" " Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.469692 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.504088 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.516293 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.518242 4830 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.518567 4830 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180") on node "crc" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.531538 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-config-data" (OuterVolumeSpecName: "config-data") pod "fa8d18ad-39c6-4264-9a3d-cab20b1ea138" (UID: "fa8d18ad-39c6-4264-9a3d-cab20b1ea138"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.543927 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.556006 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.557775 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fa8d18ad-39c6-4264-9a3d-cab20b1ea138" (UID: "fa8d18ad-39c6-4264-9a3d-cab20b1ea138"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.558819 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.563589 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.565554 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.571943 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-config-data\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.572027 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.572070 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.572110 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bdd7\" (UniqueName: \"kubernetes.io/projected/063860c7-63d8-4cec-b4f1-b6b501779d90-kube-api-access-7bdd7\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.572138 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/063860c7-63d8-4cec-b4f1-b6b501779d90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.572168 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-scripts\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.572237 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/063860c7-63d8-4cec-b4f1-b6b501779d90-logs\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.572276 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.572358 4830 reconciler_common.go:293] "Volume detached for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.572376 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.572392 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8d18ad-39c6-4264-9a3d-cab20b1ea138-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.581989 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.600571 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fb1a-account-create-update-mv227"] Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.616910 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gr4gs"] Dec 03 
22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.627732 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rkz52"] Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.639003 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-616a-account-create-update-rvrn5"] Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674006 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674041 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-config-data\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674060 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5df2f\" (UniqueName: \"kubernetes.io/projected/51a890da-b366-421f-885e-7c85df18f979-kube-api-access-5df2f\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674092 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bdd7\" (UniqueName: \"kubernetes.io/projected/063860c7-63d8-4cec-b4f1-b6b501779d90-kube-api-access-7bdd7\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674111 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-scripts\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674131 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/063860c7-63d8-4cec-b4f1-b6b501779d90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674159 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-scripts\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674185 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-run-httpd\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674208 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674230 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/063860c7-63d8-4cec-b4f1-b6b501779d90-logs\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674255 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674283 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674312 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-log-httpd\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674359 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-config-data\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.674403 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.675406 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/063860c7-63d8-4cec-b4f1-b6b501779d90-logs\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.675835 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/063860c7-63d8-4cec-b4f1-b6b501779d90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.680303 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.680336 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb08e8461f959e86adb7412ffdaebbf2e0d85ec485d90c5b01024a06c9055cd6/globalmount\"" pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.687383 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.688166 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-config-data\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.690176 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-scripts\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.690730 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063860c7-63d8-4cec-b4f1-b6b501779d90-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.703869 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bdd7\" (UniqueName: \"kubernetes.io/projected/063860c7-63d8-4cec-b4f1-b6b501779d90-kube-api-access-7bdd7\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.776477 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.784260 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-log-httpd\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.784598 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-config-data\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.784628 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5df2f\" (UniqueName: \"kubernetes.io/projected/51a890da-b366-421f-885e-7c85df18f979-kube-api-access-5df2f\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: 
I1203 22:28:09.784704 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-scripts\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.784793 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-run-httpd\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.784828 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.786077 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-log-httpd\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.789794 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.790745 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-config-data\") pod \"ceilometer-0\" (UID: 
\"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.791199 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-run-httpd\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.794289 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-scripts\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.800308 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.801611 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:09 crc kubenswrapper[4830]: E1203 22:28:09.802462 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-5df2f], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="51a890da-b366-421f-885e-7c85df18f979" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.809003 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5df2f\" (UniqueName: \"kubernetes.io/projected/51a890da-b366-421f-885e-7c85df18f979-kube-api-access-5df2f\") pod \"ceilometer-0\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " pod="openstack/ceilometer-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.824575 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74cb8d7c-67c7-4974-a810-352da3cb5d8d\") pod \"glance-default-external-api-0\" (UID: \"063860c7-63d8-4cec-b4f1-b6b501779d90\") " pod="openstack/glance-default-external-api-0" Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.975992 4830 generic.go:334] "Generic (PLEG): container finished" podID="05a32af7-077b-4320-bdc0-a208b4132154" containerID="44fdaccdcd8d7a47e8d2039990ec16eab1d6a698b38bb6a5f9019e5c18399c69" exitCode=0 Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.976791 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d6c5m" event={"ID":"05a32af7-077b-4320-bdc0-a208b4132154","Type":"ContainerDied","Data":"44fdaccdcd8d7a47e8d2039990ec16eab1d6a698b38bb6a5f9019e5c18399c69"} Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.976840 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d6c5m" event={"ID":"05a32af7-077b-4320-bdc0-a208b4132154","Type":"ContainerStarted","Data":"2e8b34269ea4b0ca48724644614d89c0aa9b1b617b809e333626ac299a320b68"} Dec 03 22:28:09 crc kubenswrapper[4830]: I1203 22:28:09.986973 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-616a-account-create-update-rvrn5" event={"ID":"42d8b7eb-e5b5-4445-9b9d-2a0472500fea","Type":"ContainerStarted","Data":"4598499cbff2237fdf226e8d4510fbad70cbd3297076d3c7eed655450e96e1fb"} Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.015235 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-616a-account-create-update-rvrn5" podStartSLOduration=5.015218161 podStartE2EDuration="5.015218161s" podCreationTimestamp="2025-12-03 22:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 22:28:10.004518761 +0000 UTC m=+1379.000980110" watchObservedRunningTime="2025-12-03 22:28:10.015218161 +0000 UTC m=+1379.011679510" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.016010 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.018396 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fa8d18ad-39c6-4264-9a3d-cab20b1ea138","Type":"ContainerDied","Data":"88ee8a3ca1fbdd48d73582bd9e684726e3b6445920e298f7779e39d48b22ec47"} Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.018469 4830 scope.go:117] "RemoveContainer" containerID="3e7c58e46ceaf242b90a388323d07ea8508b4538a702c15fdeb279488542a7e7" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.026397 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rkz52" event={"ID":"67505812-b68d-4b56-b353-749d2f12dcdd","Type":"ContainerStarted","Data":"c3d108ade0d047e492aab8ef453106c2e7660dc070a4eb7cb47f6adc9b88097b"} Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.041812 4830 generic.go:334] "Generic (PLEG): container finished" podID="630ed790-c3e9-471e-96a4-8f3c63aa837e" containerID="d1d1841f0fac9bace30726b0e572184eb687046ada40ac2f7cdd7112b3eec89d" exitCode=0 Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.041875 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-77a6-account-create-update-4jwrz" event={"ID":"630ed790-c3e9-471e-96a4-8f3c63aa837e","Type":"ContainerDied","Data":"d1d1841f0fac9bace30726b0e572184eb687046ada40ac2f7cdd7112b3eec89d"} Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.041902 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-77a6-account-create-update-4jwrz" 
event={"ID":"630ed790-c3e9-471e-96a4-8f3c63aa837e","Type":"ContainerStarted","Data":"6d27353a0751264fa43694060ec1673521af11e94cf87f6495b11ce044af3b98"} Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.045841 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-rkz52" podStartSLOduration=5.04583076 podStartE2EDuration="5.04583076s" podCreationTimestamp="2025-12-03 22:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:10.042330545 +0000 UTC m=+1379.038791894" watchObservedRunningTime="2025-12-03 22:28:10.04583076 +0000 UTC m=+1379.042292099" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.050035 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gr4gs" event={"ID":"ab83afa3-0096-4c4d-804e-656d1c5de542","Type":"ContainerStarted","Data":"6296868ed56ae39905acb727e8ee3bb0f53adafefcadfa044fbed1f2e7949f90"} Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.050074 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gr4gs" event={"ID":"ab83afa3-0096-4c4d-804e-656d1c5de542","Type":"ContainerStarted","Data":"ac2a398323e075f26b7bd48db55dfa328dc5011744ec10d143d7447202a7b750"} Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.052108 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.052718 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb1a-account-create-update-mv227" event={"ID":"37ff28df-2c9e-4694-a75d-9b7dd825c575","Type":"ContainerStarted","Data":"9d68b27bfcbbe782e3d3d689df2dc58130c516865ed56155f590e7bed57cae04"} Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.052754 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb1a-account-create-update-mv227" event={"ID":"37ff28df-2c9e-4694-a75d-9b7dd825c575","Type":"ContainerStarted","Data":"894ede9086c108d1cfd271c936feb16da395df3169ad27c04551cf47357ec464"} Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.091419 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.128154 4830 scope.go:117] "RemoveContainer" containerID="b9111f05eccde3e8cde1eb9e6e9e0610af3ee94be99bcabe9f8d6bd149b26ed6" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.132754 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.150379 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-gr4gs" podStartSLOduration=5.150337078 podStartE2EDuration="5.150337078s" podCreationTimestamp="2025-12-03 22:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:10.091759193 +0000 UTC m=+1379.088220542" watchObservedRunningTime="2025-12-03 22:28:10.150337078 +0000 UTC m=+1379.146798427" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.200119 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-run-httpd\") pod \"51a890da-b366-421f-885e-7c85df18f979\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.200660 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-log-httpd\") pod \"51a890da-b366-421f-885e-7c85df18f979\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.200942 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-config-data\") pod \"51a890da-b366-421f-885e-7c85df18f979\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.201476 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-sg-core-conf-yaml\") pod \"51a890da-b366-421f-885e-7c85df18f979\" (UID: 
\"51a890da-b366-421f-885e-7c85df18f979\") " Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.201654 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5df2f\" (UniqueName: \"kubernetes.io/projected/51a890da-b366-421f-885e-7c85df18f979-kube-api-access-5df2f\") pod \"51a890da-b366-421f-885e-7c85df18f979\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.201838 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-combined-ca-bundle\") pod \"51a890da-b366-421f-885e-7c85df18f979\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.204207 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-scripts\") pod \"51a890da-b366-421f-885e-7c85df18f979\" (UID: \"51a890da-b366-421f-885e-7c85df18f979\") " Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.200613 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51a890da-b366-421f-885e-7c85df18f979" (UID: "51a890da-b366-421f-885e-7c85df18f979"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.200896 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51a890da-b366-421f-885e-7c85df18f979" (UID: "51a890da-b366-421f-885e-7c85df18f979"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.206345 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51a890da-b366-421f-885e-7c85df18f979" (UID: "51a890da-b366-421f-885e-7c85df18f979"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.207089 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a890da-b366-421f-885e-7c85df18f979-kube-api-access-5df2f" (OuterVolumeSpecName: "kube-api-access-5df2f") pod "51a890da-b366-421f-885e-7c85df18f979" (UID: "51a890da-b366-421f-885e-7c85df18f979"). InnerVolumeSpecName "kube-api-access-5df2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.207593 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-config-data" (OuterVolumeSpecName: "config-data") pod "51a890da-b366-421f-885e-7c85df18f979" (UID: "51a890da-b366-421f-885e-7c85df18f979"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.208470 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51a890da-b366-421f-885e-7c85df18f979" (UID: "51a890da-b366-421f-885e-7c85df18f979"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.211850 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-scripts" (OuterVolumeSpecName: "scripts") pod "51a890da-b366-421f-885e-7c85df18f979" (UID: "51a890da-b366-421f-885e-7c85df18f979"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.216683 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.225435 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.235027 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.237213 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.240593 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.240842 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.242402 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.306411 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.306578 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.308651 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.308700 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.308759 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.308783 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqh5c\" (UniqueName: \"kubernetes.io/projected/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-kube-api-access-hqh5c\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.308823 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.308854 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.308984 4830 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-5df2f\" (UniqueName: \"kubernetes.io/projected/51a890da-b366-421f-885e-7c85df18f979-kube-api-access-5df2f\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.308996 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.309006 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.309016 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.309024 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51a890da-b366-421f-885e-7c85df18f979-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.309032 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.309041 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51a890da-b366-421f-885e-7c85df18f979-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.410456 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.410528 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.410560 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.410643 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.410673 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqh5c\" (UniqueName: \"kubernetes.io/projected/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-kube-api-access-hqh5c\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.410720 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.410774 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.410898 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.415081 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.415248 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.415723 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.415778 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/84423eab78faf98d19a8f63b4de761c719e5f7e98ae028cf8dc7a99f1fabf2c3/globalmount\"" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.419674 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.421253 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.421441 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.422013 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.428349 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqh5c\" (UniqueName: \"kubernetes.io/projected/ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62-kube-api-access-hqh5c\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.468572 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb6ab83e-1b8e-472e-af3c-287ab6724180\") pod \"glance-default-internal-api-0\" (UID: \"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62\") " pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.531451 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:10 crc kubenswrapper[4830]: I1203 22:28:10.755763 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.070379 4830 generic.go:334] "Generic (PLEG): container finished" podID="37ff28df-2c9e-4694-a75d-9b7dd825c575" containerID="9d68b27bfcbbe782e3d3d689df2dc58130c516865ed56155f590e7bed57cae04" exitCode=0 Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.070438 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb1a-account-create-update-mv227" event={"ID":"37ff28df-2c9e-4694-a75d-9b7dd825c575","Type":"ContainerDied","Data":"9d68b27bfcbbe782e3d3d689df2dc58130c516865ed56155f590e7bed57cae04"} Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.074243 4830 generic.go:334] "Generic (PLEG): container finished" podID="ab83afa3-0096-4c4d-804e-656d1c5de542" containerID="6296868ed56ae39905acb727e8ee3bb0f53adafefcadfa044fbed1f2e7949f90" exitCode=0 Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.074296 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gr4gs" event={"ID":"ab83afa3-0096-4c4d-804e-656d1c5de542","Type":"ContainerDied","Data":"6296868ed56ae39905acb727e8ee3bb0f53adafefcadfa044fbed1f2e7949f90"} Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.078556 4830 generic.go:334] "Generic (PLEG): container finished" podID="42d8b7eb-e5b5-4445-9b9d-2a0472500fea" containerID="f3995af32947864f39da4fb01355ffba1b0884dba503918c90f6f07ef9014fdd" exitCode=0 Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.078631 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-616a-account-create-update-rvrn5" event={"ID":"42d8b7eb-e5b5-4445-9b9d-2a0472500fea","Type":"ContainerDied","Data":"f3995af32947864f39da4fb01355ffba1b0884dba503918c90f6f07ef9014fdd"} Dec 03 22:28:11 crc 
kubenswrapper[4830]: I1203 22:28:11.078851 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.081777 4830 generic.go:334] "Generic (PLEG): container finished" podID="67505812-b68d-4b56-b353-749d2f12dcdd" containerID="91a922f2854aaf25588a516e247bf912435bcd2fb83a3771a3bb4c41c9d538c9" exitCode=0 Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.081824 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rkz52" event={"ID":"67505812-b68d-4b56-b353-749d2f12dcdd","Type":"ContainerDied","Data":"91a922f2854aaf25588a516e247bf912435bcd2fb83a3771a3bb4c41c9d538c9"} Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.084781 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"063860c7-63d8-4cec-b4f1-b6b501779d90","Type":"ContainerStarted","Data":"a4681da1166a0a7a345a14392e6b70a815424c53b4c1713699d3b7b058f11d5b"} Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.084840 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: W1203 22:28:11.122815 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded4f6693_f6bc_4a8c_b836_f0aaea3bfc62.slice/crio-c2ec6d8c7b19cc8685afff09c705bfbccf8563e6c2229dc53835204f161f6e1f WatchSource:0}: Error finding container c2ec6d8c7b19cc8685afff09c705bfbccf8563e6c2229dc53835204f161f6e1f: Status 404 returned error can't find the container with id c2ec6d8c7b19cc8685afff09c705bfbccf8563e6c2229dc53835204f161f6e1f Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.212213 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.223481 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.236716 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.239793 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.244498 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.244664 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.250071 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.333800 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-config-data\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.334186 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-log-httpd\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.334232 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.334348 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-run-httpd\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " 
pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.334383 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pkgk\" (UniqueName: \"kubernetes.io/projected/4bef5f28-8253-443d-847a-05a757c42fa1-kube-api-access-4pkgk\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.334461 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-scripts\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.334498 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.369967 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10e75b14-a94b-4936-bd47-9da029e04272" path="/var/lib/kubelet/pods/10e75b14-a94b-4936-bd47-9da029e04272/volumes" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.388866 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" path="/var/lib/kubelet/pods/1d44c909-f792-4543-8f3e-a168e708be4f/volumes" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.390017 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a890da-b366-421f-885e-7c85df18f979" path="/var/lib/kubelet/pods/51a890da-b366-421f-885e-7c85df18f979/volumes" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.396072 4830 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa8d18ad-39c6-4264-9a3d-cab20b1ea138" path="/var/lib/kubelet/pods/fa8d18ad-39c6-4264-9a3d-cab20b1ea138/volumes" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.438101 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.438209 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-run-httpd\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.438249 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pkgk\" (UniqueName: \"kubernetes.io/projected/4bef5f28-8253-443d-847a-05a757c42fa1-kube-api-access-4pkgk\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.438353 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-scripts\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.438384 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc 
kubenswrapper[4830]: I1203 22:28:11.438575 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-config-data\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.440350 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-log-httpd\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.440937 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-run-httpd\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.443634 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-log-httpd\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.450193 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.450593 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.456955 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"ceilometer-config-data" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.461806 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-scripts\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.462758 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pkgk\" (UniqueName: \"kubernetes.io/projected/4bef5f28-8253-443d-847a-05a757c42fa1-kube-api-access-4pkgk\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.463099 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.466125 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-config-data\") pod \"ceilometer-0\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.588054 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.681841 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fb1a-account-create-update-mv227" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.753175 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37ff28df-2c9e-4694-a75d-9b7dd825c575-operator-scripts\") pod \"37ff28df-2c9e-4694-a75d-9b7dd825c575\" (UID: \"37ff28df-2c9e-4694-a75d-9b7dd825c575\") " Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.753255 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnpbj\" (UniqueName: \"kubernetes.io/projected/37ff28df-2c9e-4694-a75d-9b7dd825c575-kube-api-access-qnpbj\") pod \"37ff28df-2c9e-4694-a75d-9b7dd825c575\" (UID: \"37ff28df-2c9e-4694-a75d-9b7dd825c575\") " Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.755999 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37ff28df-2c9e-4694-a75d-9b7dd825c575-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37ff28df-2c9e-4694-a75d-9b7dd825c575" (UID: "37ff28df-2c9e-4694-a75d-9b7dd825c575"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.770950 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ff28df-2c9e-4694-a75d-9b7dd825c575-kube-api-access-qnpbj" (OuterVolumeSpecName: "kube-api-access-qnpbj") pod "37ff28df-2c9e-4694-a75d-9b7dd825c575" (UID: "37ff28df-2c9e-4694-a75d-9b7dd825c575"). InnerVolumeSpecName "kube-api-access-qnpbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.824807 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-77a6-account-create-update-4jwrz" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.856019 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37ff28df-2c9e-4694-a75d-9b7dd825c575-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.856042 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnpbj\" (UniqueName: \"kubernetes.io/projected/37ff28df-2c9e-4694-a75d-9b7dd825c575-kube-api-access-qnpbj\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.875848 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d6c5m" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.957553 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48xh9\" (UniqueName: \"kubernetes.io/projected/05a32af7-077b-4320-bdc0-a208b4132154-kube-api-access-48xh9\") pod \"05a32af7-077b-4320-bdc0-a208b4132154\" (UID: \"05a32af7-077b-4320-bdc0-a208b4132154\") " Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.958663 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgw5h\" (UniqueName: \"kubernetes.io/projected/630ed790-c3e9-471e-96a4-8f3c63aa837e-kube-api-access-mgw5h\") pod \"630ed790-c3e9-471e-96a4-8f3c63aa837e\" (UID: \"630ed790-c3e9-471e-96a4-8f3c63aa837e\") " Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.958758 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05a32af7-077b-4320-bdc0-a208b4132154-operator-scripts\") pod \"05a32af7-077b-4320-bdc0-a208b4132154\" (UID: \"05a32af7-077b-4320-bdc0-a208b4132154\") " Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.959150 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630ed790-c3e9-471e-96a4-8f3c63aa837e-operator-scripts\") pod \"630ed790-c3e9-471e-96a4-8f3c63aa837e\" (UID: \"630ed790-c3e9-471e-96a4-8f3c63aa837e\") " Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.960333 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630ed790-c3e9-471e-96a4-8f3c63aa837e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "630ed790-c3e9-471e-96a4-8f3c63aa837e" (UID: "630ed790-c3e9-471e-96a4-8f3c63aa837e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.960846 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05a32af7-077b-4320-bdc0-a208b4132154-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05a32af7-077b-4320-bdc0-a208b4132154" (UID: "05a32af7-077b-4320-bdc0-a208b4132154"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.966256 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a32af7-077b-4320-bdc0-a208b4132154-kube-api-access-48xh9" (OuterVolumeSpecName: "kube-api-access-48xh9") pod "05a32af7-077b-4320-bdc0-a208b4132154" (UID: "05a32af7-077b-4320-bdc0-a208b4132154"). InnerVolumeSpecName "kube-api-access-48xh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:11 crc kubenswrapper[4830]: I1203 22:28:11.974913 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630ed790-c3e9-471e-96a4-8f3c63aa837e-kube-api-access-mgw5h" (OuterVolumeSpecName: "kube-api-access-mgw5h") pod "630ed790-c3e9-471e-96a4-8f3c63aa837e" (UID: "630ed790-c3e9-471e-96a4-8f3c63aa837e"). InnerVolumeSpecName "kube-api-access-mgw5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.061409 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48xh9\" (UniqueName: \"kubernetes.io/projected/05a32af7-077b-4320-bdc0-a208b4132154-kube-api-access-48xh9\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.061438 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgw5h\" (UniqueName: \"kubernetes.io/projected/630ed790-c3e9-471e-96a4-8f3c63aa837e-kube-api-access-mgw5h\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.061448 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05a32af7-077b-4320-bdc0-a208b4132154-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.061458 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630ed790-c3e9-471e-96a4-8f3c63aa837e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.098494 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"063860c7-63d8-4cec-b4f1-b6b501779d90","Type":"ContainerStarted","Data":"b07b265b2a40c8b6c030f361eaf2cb42817d4e534d70076168899942f2f5714f"} Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 
22:28:12.100885 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb1a-account-create-update-mv227" event={"ID":"37ff28df-2c9e-4694-a75d-9b7dd825c575","Type":"ContainerDied","Data":"894ede9086c108d1cfd271c936feb16da395df3169ad27c04551cf47357ec464"} Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.101081 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="894ede9086c108d1cfd271c936feb16da395df3169ad27c04551cf47357ec464" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.100900 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb1a-account-create-update-mv227" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.112632 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-77a6-account-create-update-4jwrz" event={"ID":"630ed790-c3e9-471e-96a4-8f3c63aa837e","Type":"ContainerDied","Data":"6d27353a0751264fa43694060ec1673521af11e94cf87f6495b11ce044af3b98"} Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.112665 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d27353a0751264fa43694060ec1673521af11e94cf87f6495b11ce044af3b98" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.112705 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-77a6-account-create-update-4jwrz" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.119006 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d6c5m" event={"ID":"05a32af7-077b-4320-bdc0-a208b4132154","Type":"ContainerDied","Data":"2e8b34269ea4b0ca48724644614d89c0aa9b1b617b809e333626ac299a320b68"} Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.119034 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e8b34269ea4b0ca48724644614d89c0aa9b1b617b809e333626ac299a320b68" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.119076 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d6c5m" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.123582 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62","Type":"ContainerStarted","Data":"0955ad370217ad2b04fae7986253ae2d73dbc3962089ea50b8d7e1d2d8293f31"} Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.123611 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62","Type":"ContainerStarted","Data":"c2ec6d8c7b19cc8685afff09c705bfbccf8563e6c2229dc53835204f161f6e1f"} Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.221869 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.715538 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rkz52" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.759217 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gr4gs" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.775460 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67505812-b68d-4b56-b353-749d2f12dcdd-operator-scripts\") pod \"67505812-b68d-4b56-b353-749d2f12dcdd\" (UID: \"67505812-b68d-4b56-b353-749d2f12dcdd\") " Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.775574 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fgjk\" (UniqueName: \"kubernetes.io/projected/67505812-b68d-4b56-b353-749d2f12dcdd-kube-api-access-2fgjk\") pod \"67505812-b68d-4b56-b353-749d2f12dcdd\" (UID: \"67505812-b68d-4b56-b353-749d2f12dcdd\") " Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.777094 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67505812-b68d-4b56-b353-749d2f12dcdd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67505812-b68d-4b56-b353-749d2f12dcdd" (UID: "67505812-b68d-4b56-b353-749d2f12dcdd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.790911 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67505812-b68d-4b56-b353-749d2f12dcdd-kube-api-access-2fgjk" (OuterVolumeSpecName: "kube-api-access-2fgjk") pod "67505812-b68d-4b56-b353-749d2f12dcdd" (UID: "67505812-b68d-4b56-b353-749d2f12dcdd"). InnerVolumeSpecName "kube-api-access-2fgjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.833972 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-616a-account-create-update-rvrn5" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.880335 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjvbr\" (UniqueName: \"kubernetes.io/projected/ab83afa3-0096-4c4d-804e-656d1c5de542-kube-api-access-hjvbr\") pod \"ab83afa3-0096-4c4d-804e-656d1c5de542\" (UID: \"ab83afa3-0096-4c4d-804e-656d1c5de542\") " Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.880747 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf9jd\" (UniqueName: \"kubernetes.io/projected/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-kube-api-access-gf9jd\") pod \"42d8b7eb-e5b5-4445-9b9d-2a0472500fea\" (UID: \"42d8b7eb-e5b5-4445-9b9d-2a0472500fea\") " Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.880791 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-operator-scripts\") pod \"42d8b7eb-e5b5-4445-9b9d-2a0472500fea\" (UID: \"42d8b7eb-e5b5-4445-9b9d-2a0472500fea\") " Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.880814 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab83afa3-0096-4c4d-804e-656d1c5de542-operator-scripts\") pod \"ab83afa3-0096-4c4d-804e-656d1c5de542\" (UID: \"ab83afa3-0096-4c4d-804e-656d1c5de542\") " Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.881290 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67505812-b68d-4b56-b353-749d2f12dcdd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.881307 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fgjk\" (UniqueName: 
\"kubernetes.io/projected/67505812-b68d-4b56-b353-749d2f12dcdd-kube-api-access-2fgjk\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.881664 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab83afa3-0096-4c4d-804e-656d1c5de542-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab83afa3-0096-4c4d-804e-656d1c5de542" (UID: "ab83afa3-0096-4c4d-804e-656d1c5de542"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.881850 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42d8b7eb-e5b5-4445-9b9d-2a0472500fea" (UID: "42d8b7eb-e5b5-4445-9b9d-2a0472500fea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.887739 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab83afa3-0096-4c4d-804e-656d1c5de542-kube-api-access-hjvbr" (OuterVolumeSpecName: "kube-api-access-hjvbr") pod "ab83afa3-0096-4c4d-804e-656d1c5de542" (UID: "ab83afa3-0096-4c4d-804e-656d1c5de542"). InnerVolumeSpecName "kube-api-access-hjvbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.891634 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-kube-api-access-gf9jd" (OuterVolumeSpecName: "kube-api-access-gf9jd") pod "42d8b7eb-e5b5-4445-9b9d-2a0472500fea" (UID: "42d8b7eb-e5b5-4445-9b9d-2a0472500fea"). InnerVolumeSpecName "kube-api-access-gf9jd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.984138 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjvbr\" (UniqueName: \"kubernetes.io/projected/ab83afa3-0096-4c4d-804e-656d1c5de542-kube-api-access-hjvbr\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.984206 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf9jd\" (UniqueName: \"kubernetes.io/projected/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-kube-api-access-gf9jd\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.984237 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42d8b7eb-e5b5-4445-9b9d-2a0472500fea-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:12 crc kubenswrapper[4830]: I1203 22:28:12.984262 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab83afa3-0096-4c4d-804e-656d1c5de542-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.021040 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.141257 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62","Type":"ContainerStarted","Data":"bedd0c504dca1ebbb3220ebbc51adf7d28f1165ffb6b1983ab14b8933d617e93"} Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.142785 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rkz52" event={"ID":"67505812-b68d-4b56-b353-749d2f12dcdd","Type":"ContainerDied","Data":"c3d108ade0d047e492aab8ef453106c2e7660dc070a4eb7cb47f6adc9b88097b"} Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.142824 
4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3d108ade0d047e492aab8ef453106c2e7660dc070a4eb7cb47f6adc9b88097b" Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.142866 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rkz52" Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.156172 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef5f28-8253-443d-847a-05a757c42fa1","Type":"ContainerStarted","Data":"7c323900508da01bf99fadd2612613913fdba8882372a9d650657ec4cfb605fa"} Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.161887 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"063860c7-63d8-4cec-b4f1-b6b501779d90","Type":"ContainerStarted","Data":"824dc99fef707d13b098d7142c69483c09a137f4e2734813630f0d55a5b3eb5d"} Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.166870 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gr4gs" event={"ID":"ab83afa3-0096-4c4d-804e-656d1c5de542","Type":"ContainerDied","Data":"ac2a398323e075f26b7bd48db55dfa328dc5011744ec10d143d7447202a7b750"} Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.166894 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac2a398323e075f26b7bd48db55dfa328dc5011744ec10d143d7447202a7b750" Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.166942 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gr4gs" Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.171821 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-616a-account-create-update-rvrn5" event={"ID":"42d8b7eb-e5b5-4445-9b9d-2a0472500fea","Type":"ContainerDied","Data":"4598499cbff2237fdf226e8d4510fbad70cbd3297076d3c7eed655450e96e1fb"} Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.171864 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4598499cbff2237fdf226e8d4510fbad70cbd3297076d3c7eed655450e96e1fb" Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.171924 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-616a-account-create-update-rvrn5" Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.184706 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.184681273 podStartE2EDuration="3.184681273s" podCreationTimestamp="2025-12-03 22:28:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:13.172679339 +0000 UTC m=+1382.169140688" watchObservedRunningTime="2025-12-03 22:28:13.184681273 +0000 UTC m=+1382.181142622" Dec 03 22:28:13 crc kubenswrapper[4830]: I1203 22:28:13.214024 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.214006337 podStartE2EDuration="4.214006337s" podCreationTimestamp="2025-12-03 22:28:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:13.196580346 +0000 UTC m=+1382.193041695" watchObservedRunningTime="2025-12-03 22:28:13.214006337 +0000 UTC m=+1382.210467686" Dec 03 22:28:14 crc 
kubenswrapper[4830]: I1203 22:28:14.185211 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef5f28-8253-443d-847a-05a757c42fa1","Type":"ContainerStarted","Data":"03ce774e713a1c2e075856a9a4aefa62679a21b997d09d8326bc6d1f209d0788"} Dec 03 22:28:14 crc kubenswrapper[4830]: I1203 22:28:14.185551 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef5f28-8253-443d-847a-05a757c42fa1","Type":"ContainerStarted","Data":"7ced4866fca7684317b1866965340d5324e664a98ead6897195ebd44fa8fce5c"} Dec 03 22:28:15 crc kubenswrapper[4830]: I1203 22:28:15.201102 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef5f28-8253-443d-847a-05a757c42fa1","Type":"ContainerStarted","Data":"26d8332b3b8aac79ff6d5d59ed3b12be2e1f2ed223f98ed8131694b13868e279"} Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.213042 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef5f28-8253-443d-847a-05a757c42fa1","Type":"ContainerStarted","Data":"3363ceeef6da78e752bca7169f942bd2501da74c38fddf7bb4019fc7cb02177c"} Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.213353 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.213254 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="ceilometer-central-agent" containerID="cri-o://7ced4866fca7684317b1866965340d5324e664a98ead6897195ebd44fa8fce5c" gracePeriod=30 Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.213358 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="sg-core" 
containerID="cri-o://26d8332b3b8aac79ff6d5d59ed3b12be2e1f2ed223f98ed8131694b13868e279" gracePeriod=30 Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.213402 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="proxy-httpd" containerID="cri-o://3363ceeef6da78e752bca7169f942bd2501da74c38fddf7bb4019fc7cb02177c" gracePeriod=30 Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.213354 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="ceilometer-notification-agent" containerID="cri-o://03ce774e713a1c2e075856a9a4aefa62679a21b997d09d8326bc6d1f209d0788" gracePeriod=30 Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.264209 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.98171295 podStartE2EDuration="5.264183191s" podCreationTimestamp="2025-12-03 22:28:11 +0000 UTC" firstStartedPulling="2025-12-03 22:28:12.25679704 +0000 UTC m=+1381.253258389" lastFinishedPulling="2025-12-03 22:28:15.539267261 +0000 UTC m=+1384.535728630" observedRunningTime="2025-12-03 22:28:16.239624297 +0000 UTC m=+1385.236085646" watchObservedRunningTime="2025-12-03 22:28:16.264183191 +0000 UTC m=+1385.260644550" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.381575 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nb55g"] Dec 03 22:28:16 crc kubenswrapper[4830]: E1203 22:28:16.382355 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67505812-b68d-4b56-b353-749d2f12dcdd" containerName="mariadb-database-create" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382373 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="67505812-b68d-4b56-b353-749d2f12dcdd" containerName="mariadb-database-create" Dec 
03 22:28:16 crc kubenswrapper[4830]: E1203 22:28:16.382395 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab83afa3-0096-4c4d-804e-656d1c5de542" containerName="mariadb-database-create" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382401 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab83afa3-0096-4c4d-804e-656d1c5de542" containerName="mariadb-database-create" Dec 03 22:28:16 crc kubenswrapper[4830]: E1203 22:28:16.382418 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ff28df-2c9e-4694-a75d-9b7dd825c575" containerName="mariadb-account-create-update" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382424 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ff28df-2c9e-4694-a75d-9b7dd825c575" containerName="mariadb-account-create-update" Dec 03 22:28:16 crc kubenswrapper[4830]: E1203 22:28:16.382435 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a32af7-077b-4320-bdc0-a208b4132154" containerName="mariadb-database-create" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382441 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a32af7-077b-4320-bdc0-a208b4132154" containerName="mariadb-database-create" Dec 03 22:28:16 crc kubenswrapper[4830]: E1203 22:28:16.382454 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d8b7eb-e5b5-4445-9b9d-2a0472500fea" containerName="mariadb-account-create-update" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382460 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d8b7eb-e5b5-4445-9b9d-2a0472500fea" containerName="mariadb-account-create-update" Dec 03 22:28:16 crc kubenswrapper[4830]: E1203 22:28:16.382473 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630ed790-c3e9-471e-96a4-8f3c63aa837e" containerName="mariadb-account-create-update" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382479 4830 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="630ed790-c3e9-471e-96a4-8f3c63aa837e" containerName="mariadb-account-create-update" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382678 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="630ed790-c3e9-471e-96a4-8f3c63aa837e" containerName="mariadb-account-create-update" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382693 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a32af7-077b-4320-bdc0-a208b4132154" containerName="mariadb-database-create" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382705 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab83afa3-0096-4c4d-804e-656d1c5de542" containerName="mariadb-database-create" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382724 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="67505812-b68d-4b56-b353-749d2f12dcdd" containerName="mariadb-database-create" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382734 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ff28df-2c9e-4694-a75d-9b7dd825c575" containerName="mariadb-account-create-update" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.382744 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d8b7eb-e5b5-4445-9b9d-2a0472500fea" containerName="mariadb-account-create-update" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.383533 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.386278 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.386588 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hw9lf" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.386809 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.388938 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nb55g"] Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.471716 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-scripts\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.471868 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdlx4\" (UniqueName: \"kubernetes.io/projected/026064cc-f701-4574-baf6-261e157061a8-kube-api-access-rdlx4\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.471897 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " 
pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.471948 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-config-data\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.573092 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-config-data\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.573170 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-scripts\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.573262 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdlx4\" (UniqueName: \"kubernetes.io/projected/026064cc-f701-4574-baf6-261e157061a8-kube-api-access-rdlx4\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.573284 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: 
\"026064cc-f701-4574-baf6-261e157061a8\") " pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.578146 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.580849 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-scripts\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.581315 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-config-data\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.588428 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdlx4\" (UniqueName: \"kubernetes.io/projected/026064cc-f701-4574-baf6-261e157061a8-kube-api-access-rdlx4\") pod \"nova-cell0-conductor-db-sync-nb55g\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:16 crc kubenswrapper[4830]: I1203 22:28:16.702140 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:17 crc kubenswrapper[4830]: I1203 22:28:17.249990 4830 generic.go:334] "Generic (PLEG): container finished" podID="4bef5f28-8253-443d-847a-05a757c42fa1" containerID="3363ceeef6da78e752bca7169f942bd2501da74c38fddf7bb4019fc7cb02177c" exitCode=0 Dec 03 22:28:17 crc kubenswrapper[4830]: I1203 22:28:17.250021 4830 generic.go:334] "Generic (PLEG): container finished" podID="4bef5f28-8253-443d-847a-05a757c42fa1" containerID="26d8332b3b8aac79ff6d5d59ed3b12be2e1f2ed223f98ed8131694b13868e279" exitCode=2 Dec 03 22:28:17 crc kubenswrapper[4830]: I1203 22:28:17.250029 4830 generic.go:334] "Generic (PLEG): container finished" podID="4bef5f28-8253-443d-847a-05a757c42fa1" containerID="03ce774e713a1c2e075856a9a4aefa62679a21b997d09d8326bc6d1f209d0788" exitCode=0 Dec 03 22:28:17 crc kubenswrapper[4830]: I1203 22:28:17.250050 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef5f28-8253-443d-847a-05a757c42fa1","Type":"ContainerDied","Data":"3363ceeef6da78e752bca7169f942bd2501da74c38fddf7bb4019fc7cb02177c"} Dec 03 22:28:17 crc kubenswrapper[4830]: I1203 22:28:17.250074 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef5f28-8253-443d-847a-05a757c42fa1","Type":"ContainerDied","Data":"26d8332b3b8aac79ff6d5d59ed3b12be2e1f2ed223f98ed8131694b13868e279"} Dec 03 22:28:17 crc kubenswrapper[4830]: I1203 22:28:17.250085 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef5f28-8253-443d-847a-05a757c42fa1","Type":"ContainerDied","Data":"03ce774e713a1c2e075856a9a4aefa62679a21b997d09d8326bc6d1f209d0788"} Dec 03 22:28:17 crc kubenswrapper[4830]: I1203 22:28:17.308594 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nb55g"] Dec 03 22:28:18 crc kubenswrapper[4830]: I1203 22:28:18.270413 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nb55g" event={"ID":"026064cc-f701-4574-baf6-261e157061a8","Type":"ContainerStarted","Data":"0f098f7a2248ec2063ab9eee547632c1ce8d0c90dc7a012e62d07ad641454a6e"} Dec 03 22:28:20 crc kubenswrapper[4830]: I1203 22:28:20.134449 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 22:28:20 crc kubenswrapper[4830]: I1203 22:28:20.134798 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 22:28:20 crc kubenswrapper[4830]: I1203 22:28:20.167433 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 22:28:20 crc kubenswrapper[4830]: I1203 22:28:20.173428 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 22:28:20 crc kubenswrapper[4830]: I1203 22:28:20.291244 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 22:28:20 crc kubenswrapper[4830]: I1203 22:28:20.291279 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 22:28:20 crc kubenswrapper[4830]: I1203 22:28:20.532054 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:20 crc kubenswrapper[4830]: I1203 22:28:20.532126 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:20 crc kubenswrapper[4830]: I1203 22:28:20.575987 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:20 crc kubenswrapper[4830]: I1203 22:28:20.587568 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:21 crc kubenswrapper[4830]: I1203 22:28:21.299585 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:21 crc kubenswrapper[4830]: I1203 22:28:21.299920 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:22 crc kubenswrapper[4830]: I1203 22:28:22.316923 4830 generic.go:334] "Generic (PLEG): container finished" podID="737c1ba4-5eac-4592-8e83-6d8263eb257c" containerID="8eb8d04d323be9531430e60db5b7a09c9649781929f56a5c5029fcea6df97bea" exitCode=137 Dec 03 22:28:22 crc kubenswrapper[4830]: I1203 22:28:22.317179 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"737c1ba4-5eac-4592-8e83-6d8263eb257c","Type":"ContainerDied","Data":"8eb8d04d323be9531430e60db5b7a09c9649781929f56a5c5029fcea6df97bea"} Dec 03 22:28:22 crc kubenswrapper[4830]: I1203 22:28:22.430734 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 22:28:22 crc kubenswrapper[4830]: I1203 22:28:22.430860 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:28:22 crc kubenswrapper[4830]: I1203 22:28:22.433643 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 22:28:23 crc kubenswrapper[4830]: I1203 22:28:23.256365 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 22:28:23 crc kubenswrapper[4830]: I1203 22:28:23.325697 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:28:23 crc kubenswrapper[4830]: I1203 22:28:23.330555 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 
03 22:28:25 crc kubenswrapper[4830]: I1203 22:28:25.351310 4830 generic.go:334] "Generic (PLEG): container finished" podID="4bef5f28-8253-443d-847a-05a757c42fa1" containerID="7ced4866fca7684317b1866965340d5324e664a98ead6897195ebd44fa8fce5c" exitCode=0 Dec 03 22:28:25 crc kubenswrapper[4830]: I1203 22:28:25.351378 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef5f28-8253-443d-847a-05a757c42fa1","Type":"ContainerDied","Data":"7ced4866fca7684317b1866965340d5324e664a98ead6897195ebd44fa8fce5c"} Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.373381 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"737c1ba4-5eac-4592-8e83-6d8263eb257c","Type":"ContainerDied","Data":"2f61caab540b96ec885ee096948d7120a7c71055a2ccb7ca2b430261279d0dc3"} Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.373795 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f61caab540b96ec885ee096948d7120a7c71055a2ccb7ca2b430261279d0dc3" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.376114 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef5f28-8253-443d-847a-05a757c42fa1","Type":"ContainerDied","Data":"7c323900508da01bf99fadd2612613913fdba8882372a9d650657ec4cfb605fa"} Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.376139 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c323900508da01bf99fadd2612613913fdba8882372a9d650657ec4cfb605fa" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.443562 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.460766 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497015 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data-custom\") pod \"737c1ba4-5eac-4592-8e83-6d8263eb257c\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497143 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-sg-core-conf-yaml\") pod \"4bef5f28-8253-443d-847a-05a757c42fa1\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497184 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-log-httpd\") pod \"4bef5f28-8253-443d-847a-05a757c42fa1\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497290 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-scripts\") pod \"4bef5f28-8253-443d-847a-05a757c42fa1\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497344 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data\") pod \"737c1ba4-5eac-4592-8e83-6d8263eb257c\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497386 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-combined-ca-bundle\") pod \"4bef5f28-8253-443d-847a-05a757c42fa1\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497423 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-run-httpd\") pod \"4bef5f28-8253-443d-847a-05a757c42fa1\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497503 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-scripts\") pod \"737c1ba4-5eac-4592-8e83-6d8263eb257c\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497616 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzdn2\" (UniqueName: \"kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-kube-api-access-lzdn2\") pod \"737c1ba4-5eac-4592-8e83-6d8263eb257c\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497659 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-combined-ca-bundle\") pod \"737c1ba4-5eac-4592-8e83-6d8263eb257c\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497720 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pkgk\" (UniqueName: \"kubernetes.io/projected/4bef5f28-8253-443d-847a-05a757c42fa1-kube-api-access-4pkgk\") pod \"4bef5f28-8253-443d-847a-05a757c42fa1\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 
22:28:26.497759 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-config-data\") pod \"4bef5f28-8253-443d-847a-05a757c42fa1\" (UID: \"4bef5f28-8253-443d-847a-05a757c42fa1\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.497794 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-certs\") pod \"737c1ba4-5eac-4592-8e83-6d8263eb257c\" (UID: \"737c1ba4-5eac-4592-8e83-6d8263eb257c\") " Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.498554 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4bef5f28-8253-443d-847a-05a757c42fa1" (UID: "4bef5f28-8253-443d-847a-05a757c42fa1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.499497 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.500525 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "737c1ba4-5eac-4592-8e83-6d8263eb257c" (UID: "737c1ba4-5eac-4592-8e83-6d8263eb257c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.500833 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4bef5f28-8253-443d-847a-05a757c42fa1" (UID: "4bef5f28-8253-443d-847a-05a757c42fa1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.513231 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-scripts" (OuterVolumeSpecName: "scripts") pod "4bef5f28-8253-443d-847a-05a757c42fa1" (UID: "4bef5f28-8253-443d-847a-05a757c42fa1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.513238 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-scripts" (OuterVolumeSpecName: "scripts") pod "737c1ba4-5eac-4592-8e83-6d8263eb257c" (UID: "737c1ba4-5eac-4592-8e83-6d8263eb257c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.515191 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bef5f28-8253-443d-847a-05a757c42fa1-kube-api-access-4pkgk" (OuterVolumeSpecName: "kube-api-access-4pkgk") pod "4bef5f28-8253-443d-847a-05a757c42fa1" (UID: "4bef5f28-8253-443d-847a-05a757c42fa1"). InnerVolumeSpecName "kube-api-access-4pkgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.522435 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-certs" (OuterVolumeSpecName: "certs") pod "737c1ba4-5eac-4592-8e83-6d8263eb257c" (UID: "737c1ba4-5eac-4592-8e83-6d8263eb257c"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.526267 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-kube-api-access-lzdn2" (OuterVolumeSpecName: "kube-api-access-lzdn2") pod "737c1ba4-5eac-4592-8e83-6d8263eb257c" (UID: "737c1ba4-5eac-4592-8e83-6d8263eb257c"). InnerVolumeSpecName "kube-api-access-lzdn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.602180 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.602216 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzdn2\" (UniqueName: \"kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-kube-api-access-lzdn2\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.602230 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pkgk\" (UniqueName: \"kubernetes.io/projected/4bef5f28-8253-443d-847a-05a757c42fa1-kube-api-access-4pkgk\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.602238 4830 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/737c1ba4-5eac-4592-8e83-6d8263eb257c-certs\") on node \"crc\" DevicePath \"\"" 
Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.602246 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.602254 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef5f28-8253-443d-847a-05a757c42fa1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.602263 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.604680 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4bef5f28-8253-443d-847a-05a757c42fa1" (UID: "4bef5f28-8253-443d-847a-05a757c42fa1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.615242 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data" (OuterVolumeSpecName: "config-data") pod "737c1ba4-5eac-4592-8e83-6d8263eb257c" (UID: "737c1ba4-5eac-4592-8e83-6d8263eb257c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.623528 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "737c1ba4-5eac-4592-8e83-6d8263eb257c" (UID: "737c1ba4-5eac-4592-8e83-6d8263eb257c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.671724 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bef5f28-8253-443d-847a-05a757c42fa1" (UID: "4bef5f28-8253-443d-847a-05a757c42fa1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.683191 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.683237 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.686023 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-config-data" (OuterVolumeSpecName: "config-data") pod "4bef5f28-8253-443d-847a-05a757c42fa1" (UID: 
"4bef5f28-8253-443d-847a-05a757c42fa1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.704216 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.704247 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.704256 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.704265 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737c1ba4-5eac-4592-8e83-6d8263eb257c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:26 crc kubenswrapper[4830]: I1203 22:28:26.704274 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bef5f28-8253-443d-847a-05a757c42fa1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.387166 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nb55g" event={"ID":"026064cc-f701-4574-baf6-261e157061a8","Type":"ContainerStarted","Data":"ce7de5cea40f94b237ce0a817171e7fc448448770ce0e17bd595172e4545c28c"} Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.387196 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.387212 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.420566 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-nb55g" podStartSLOduration=2.388053854 podStartE2EDuration="11.420545521s" podCreationTimestamp="2025-12-03 22:28:16 +0000 UTC" firstStartedPulling="2025-12-03 22:28:17.282693407 +0000 UTC m=+1386.279154756" lastFinishedPulling="2025-12-03 22:28:26.315185074 +0000 UTC m=+1395.311646423" observedRunningTime="2025-12-03 22:28:27.402277097 +0000 UTC m=+1396.398738446" watchObservedRunningTime="2025-12-03 22:28:27.420545521 +0000 UTC m=+1396.417006870" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.431975 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.446033 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.455255 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.468183 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.476907 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:28:27 crc kubenswrapper[4830]: E1203 22:28:27.477559 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="ceilometer-central-agent" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.477641 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" 
containerName="ceilometer-central-agent" Dec 03 22:28:27 crc kubenswrapper[4830]: E1203 22:28:27.477698 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="ceilometer-notification-agent" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.477760 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="ceilometer-notification-agent" Dec 03 22:28:27 crc kubenswrapper[4830]: E1203 22:28:27.477829 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="proxy-httpd" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.477879 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="proxy-httpd" Dec 03 22:28:27 crc kubenswrapper[4830]: E1203 22:28:27.477929 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737c1ba4-5eac-4592-8e83-6d8263eb257c" containerName="cloudkitty-proc" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.477979 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="737c1ba4-5eac-4592-8e83-6d8263eb257c" containerName="cloudkitty-proc" Dec 03 22:28:27 crc kubenswrapper[4830]: E1203 22:28:27.478048 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="sg-core" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.478104 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="sg-core" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.478387 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="ceilometer-central-agent" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.478531 4830 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="sg-core" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.478608 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="ceilometer-notification-agent" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.478681 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" containerName="proxy-httpd" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.478756 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="737c1ba4-5eac-4592-8e83-6d8263eb257c" containerName="cloudkitty-proc" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.479531 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.481953 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.485155 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.493524 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.495931 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.500331 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.502118 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.509243 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.520602 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-scripts\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.522384 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.522516 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6c54\" (UniqueName: \"kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-kube-api-access-c6c54\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.522657 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-combined-ca-bundle\") pod \"cloudkitty-proc-0\" 
(UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.522755 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.522846 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-certs\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624230 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624304 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-certs\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624336 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-config-data\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624379 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-scripts\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624445 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-scripts\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624468 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-run-httpd\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624485 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624538 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624563 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-524g8\" (UniqueName: 
\"kubernetes.io/projected/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-kube-api-access-524g8\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624589 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6c54\" (UniqueName: \"kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-kube-api-access-c6c54\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624643 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-log-httpd\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624683 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.624709 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.628976 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") 
" pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.629417 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-scripts\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.630559 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-certs\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.640227 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.641086 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.645959 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6c54\" (UniqueName: \"kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-kube-api-access-c6c54\") pod \"cloudkitty-proc-0\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.725999 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-run-httpd\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.726071 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.726102 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-524g8\" (UniqueName: \"kubernetes.io/projected/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-kube-api-access-524g8\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.726164 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-log-httpd\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.726214 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.726271 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-config-data\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc 
kubenswrapper[4830]: I1203 22:28:27.726309 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-scripts\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.726575 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-run-httpd\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.726903 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-log-httpd\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.730205 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-scripts\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.733304 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.733319 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.733467 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-config-data\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.742595 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-524g8\" (UniqueName: \"kubernetes.io/projected/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-kube-api-access-524g8\") pod \"ceilometer-0\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " pod="openstack/ceilometer-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.798152 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 22:28:27 crc kubenswrapper[4830]: I1203 22:28:27.817587 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:28 crc kubenswrapper[4830]: I1203 22:28:28.073835 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:28 crc kubenswrapper[4830]: I1203 22:28:28.326019 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:28:28 crc kubenswrapper[4830]: I1203 22:28:28.399299 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192","Type":"ContainerStarted","Data":"40b3cc210fac68d172ad919ba131e9593a1e6659f030a67e2e5a7d943e900512"} Dec 03 22:28:28 crc kubenswrapper[4830]: W1203 22:28:28.496046 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30c149b6_9f60_45fa_8289_34c1ceb3a7f2.slice/crio-1d9e945d6d513fa83098f9e6b9184d9f64329f494000284c56ef2c0b9602deb4 WatchSource:0}: Error finding container 1d9e945d6d513fa83098f9e6b9184d9f64329f494000284c56ef2c0b9602deb4: Status 404 returned error can't find the container with id 1d9e945d6d513fa83098f9e6b9184d9f64329f494000284c56ef2c0b9602deb4 Dec 03 22:28:28 crc kubenswrapper[4830]: I1203 22:28:28.499460 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:29 crc kubenswrapper[4830]: I1203 22:28:29.358695 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bef5f28-8253-443d-847a-05a757c42fa1" path="/var/lib/kubelet/pods/4bef5f28-8253-443d-847a-05a757c42fa1/volumes" Dec 03 22:28:29 crc kubenswrapper[4830]: I1203 22:28:29.360055 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737c1ba4-5eac-4592-8e83-6d8263eb257c" path="/var/lib/kubelet/pods/737c1ba4-5eac-4592-8e83-6d8263eb257c/volumes" Dec 03 22:28:29 crc kubenswrapper[4830]: I1203 22:28:29.413047 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"30c149b6-9f60-45fa-8289-34c1ceb3a7f2","Type":"ContainerStarted","Data":"eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6"} Dec 03 22:28:29 crc kubenswrapper[4830]: I1203 22:28:29.413255 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30c149b6-9f60-45fa-8289-34c1ceb3a7f2","Type":"ContainerStarted","Data":"1d9e945d6d513fa83098f9e6b9184d9f64329f494000284c56ef2c0b9602deb4"} Dec 03 22:28:29 crc kubenswrapper[4830]: I1203 22:28:29.414498 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192","Type":"ContainerStarted","Data":"bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f"} Dec 03 22:28:29 crc kubenswrapper[4830]: I1203 22:28:29.440005 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.439979088 podStartE2EDuration="2.439979088s" podCreationTimestamp="2025-12-03 22:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:29.43083127 +0000 UTC m=+1398.427292629" watchObservedRunningTime="2025-12-03 22:28:29.439979088 +0000 UTC m=+1398.436440467" Dec 03 22:28:30 crc kubenswrapper[4830]: I1203 22:28:30.452552 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Dec 03 22:28:31 crc kubenswrapper[4830]: I1203 22:28:31.437827 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30c149b6-9f60-45fa-8289-34c1ceb3a7f2","Type":"ContainerStarted","Data":"e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d"} Dec 03 22:28:32 crc kubenswrapper[4830]: I1203 22:28:32.451357 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"30c149b6-9f60-45fa-8289-34c1ceb3a7f2","Type":"ContainerStarted","Data":"716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b"} Dec 03 22:28:33 crc kubenswrapper[4830]: I1203 22:28:33.463171 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30c149b6-9f60-45fa-8289-34c1ceb3a7f2","Type":"ContainerStarted","Data":"bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b"} Dec 03 22:28:33 crc kubenswrapper[4830]: I1203 22:28:33.463578 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:28:33 crc kubenswrapper[4830]: I1203 22:28:33.463276 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="ceilometer-central-agent" containerID="cri-o://eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6" gracePeriod=30 Dec 03 22:28:33 crc kubenswrapper[4830]: I1203 22:28:33.463437 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="proxy-httpd" containerID="cri-o://bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b" gracePeriod=30 Dec 03 22:28:33 crc kubenswrapper[4830]: I1203 22:28:33.463481 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="ceilometer-notification-agent" containerID="cri-o://e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d" gracePeriod=30 Dec 03 22:28:33 crc kubenswrapper[4830]: I1203 22:28:33.463401 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="sg-core" containerID="cri-o://716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b" 
gracePeriod=30 Dec 03 22:28:33 crc kubenswrapper[4830]: I1203 22:28:33.496615 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.074943838 podStartE2EDuration="6.496589021s" podCreationTimestamp="2025-12-03 22:28:27 +0000 UTC" firstStartedPulling="2025-12-03 22:28:28.499065611 +0000 UTC m=+1397.495526980" lastFinishedPulling="2025-12-03 22:28:32.920710814 +0000 UTC m=+1401.917172163" observedRunningTime="2025-12-03 22:28:33.483560108 +0000 UTC m=+1402.480021457" watchObservedRunningTime="2025-12-03 22:28:33.496589021 +0000 UTC m=+1402.493050390" Dec 03 22:28:34 crc kubenswrapper[4830]: I1203 22:28:34.478797 4830 generic.go:334] "Generic (PLEG): container finished" podID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerID="bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b" exitCode=0 Dec 03 22:28:34 crc kubenswrapper[4830]: I1203 22:28:34.479157 4830 generic.go:334] "Generic (PLEG): container finished" podID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerID="716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b" exitCode=2 Dec 03 22:28:34 crc kubenswrapper[4830]: I1203 22:28:34.479173 4830 generic.go:334] "Generic (PLEG): container finished" podID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerID="e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d" exitCode=0 Dec 03 22:28:34 crc kubenswrapper[4830]: I1203 22:28:34.478865 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30c149b6-9f60-45fa-8289-34c1ceb3a7f2","Type":"ContainerDied","Data":"bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b"} Dec 03 22:28:34 crc kubenswrapper[4830]: I1203 22:28:34.479222 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30c149b6-9f60-45fa-8289-34c1ceb3a7f2","Type":"ContainerDied","Data":"716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b"} Dec 03 
22:28:34 crc kubenswrapper[4830]: I1203 22:28:34.479245 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30c149b6-9f60-45fa-8289-34c1ceb3a7f2","Type":"ContainerDied","Data":"e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d"} Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.243680 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.348152 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-log-httpd\") pod \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.348486 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-scripts\") pod \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.348597 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-sg-core-conf-yaml\") pod \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.348634 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-combined-ca-bundle\") pod \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.348682 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-config-data\") pod \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.348702 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-524g8\" (UniqueName: \"kubernetes.io/projected/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-kube-api-access-524g8\") pod \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.348727 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-run-httpd\") pod \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\" (UID: \"30c149b6-9f60-45fa-8289-34c1ceb3a7f2\") " Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.349010 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "30c149b6-9f60-45fa-8289-34c1ceb3a7f2" (UID: "30c149b6-9f60-45fa-8289-34c1ceb3a7f2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.349184 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "30c149b6-9f60-45fa-8289-34c1ceb3a7f2" (UID: "30c149b6-9f60-45fa-8289-34c1ceb3a7f2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.349842 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.349896 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.353952 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-kube-api-access-524g8" (OuterVolumeSpecName: "kube-api-access-524g8") pod "30c149b6-9f60-45fa-8289-34c1ceb3a7f2" (UID: "30c149b6-9f60-45fa-8289-34c1ceb3a7f2"). InnerVolumeSpecName "kube-api-access-524g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.354580 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-scripts" (OuterVolumeSpecName: "scripts") pod "30c149b6-9f60-45fa-8289-34c1ceb3a7f2" (UID: "30c149b6-9f60-45fa-8289-34c1ceb3a7f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.379936 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "30c149b6-9f60-45fa-8289-34c1ceb3a7f2" (UID: "30c149b6-9f60-45fa-8289-34c1ceb3a7f2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.439064 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30c149b6-9f60-45fa-8289-34c1ceb3a7f2" (UID: "30c149b6-9f60-45fa-8289-34c1ceb3a7f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.452110 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.452291 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.452376 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.452443 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-524g8\" (UniqueName: \"kubernetes.io/projected/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-kube-api-access-524g8\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.475442 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-config-data" (OuterVolumeSpecName: "config-data") pod "30c149b6-9f60-45fa-8289-34c1ceb3a7f2" (UID: "30c149b6-9f60-45fa-8289-34c1ceb3a7f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.484479 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1d44c909-f792-4543-8f3e-a168e708be4f" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.183:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.541040 4830 generic.go:334] "Generic (PLEG): container finished" podID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerID="eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6" exitCode=0 Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.541102 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30c149b6-9f60-45fa-8289-34c1ceb3a7f2","Type":"ContainerDied","Data":"eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6"} Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.541133 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30c149b6-9f60-45fa-8289-34c1ceb3a7f2","Type":"ContainerDied","Data":"1d9e945d6d513fa83098f9e6b9184d9f64329f494000284c56ef2c0b9602deb4"} Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.541154 4830 scope.go:117] "RemoveContainer" containerID="bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.541289 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.553796 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c149b6-9f60-45fa-8289-34c1ceb3a7f2-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.580222 4830 scope.go:117] "RemoveContainer" containerID="716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.585964 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.597302 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.603820 4830 scope.go:117] "RemoveContainer" containerID="e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.612472 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:38 crc kubenswrapper[4830]: E1203 22:28:38.613090 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="ceilometer-central-agent" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.613173 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="ceilometer-central-agent" Dec 03 22:28:38 crc kubenswrapper[4830]: E1203 22:28:38.613277 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="proxy-httpd" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.613416 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="proxy-httpd" Dec 03 22:28:38 crc kubenswrapper[4830]: E1203 22:28:38.613478 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="sg-core" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.613546 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="sg-core" Dec 03 22:28:38 crc kubenswrapper[4830]: E1203 22:28:38.613606 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="ceilometer-notification-agent" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.613665 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="ceilometer-notification-agent" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.614111 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="ceilometer-notification-agent" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.614212 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="sg-core" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.614288 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="proxy-httpd" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.614356 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" containerName="ceilometer-central-agent" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.616326 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.621239 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.626348 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.655049 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.655091 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-log-httpd\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.655134 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-scripts\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.655186 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzqtd\" (UniqueName: \"kubernetes.io/projected/529dcab1-017d-47a0-a2e9-2110c3040135-kube-api-access-qzqtd\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.655206 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-config-data\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.655229 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.655273 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-run-httpd\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.656637 4830 scope.go:117] "RemoveContainer" containerID="eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.673697 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.694634 4830 scope.go:117] "RemoveContainer" containerID="bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b" Dec 03 22:28:38 crc kubenswrapper[4830]: E1203 22:28:38.695139 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b\": container with ID starting with bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b not found: ID does not exist" containerID="bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b" Dec 03 
22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.695238 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b"} err="failed to get container status \"bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b\": rpc error: code = NotFound desc = could not find container \"bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b\": container with ID starting with bd6347f79208397e10d832c7bc77ce7bdb2e3e060a46a96e07acbd01394eb13b not found: ID does not exist" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.695336 4830 scope.go:117] "RemoveContainer" containerID="716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b" Dec 03 22:28:38 crc kubenswrapper[4830]: E1203 22:28:38.695937 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b\": container with ID starting with 716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b not found: ID does not exist" containerID="716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.695989 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b"} err="failed to get container status \"716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b\": rpc error: code = NotFound desc = could not find container \"716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b\": container with ID starting with 716949b18cd631c5a402aad1deab296143083ac75a24a48f2d3aa8abcc2cb66b not found: ID does not exist" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.696020 4830 scope.go:117] "RemoveContainer" 
containerID="e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d" Dec 03 22:28:38 crc kubenswrapper[4830]: E1203 22:28:38.696272 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d\": container with ID starting with e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d not found: ID does not exist" containerID="e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.696299 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d"} err="failed to get container status \"e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d\": rpc error: code = NotFound desc = could not find container \"e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d\": container with ID starting with e92f3dce016d8606495090f61e10eb12f5f7186bda3587c32f4c0734581ad71d not found: ID does not exist" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.696316 4830 scope.go:117] "RemoveContainer" containerID="eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6" Dec 03 22:28:38 crc kubenswrapper[4830]: E1203 22:28:38.696544 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6\": container with ID starting with eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6 not found: ID does not exist" containerID="eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.696571 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6"} err="failed to get container status \"eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6\": rpc error: code = NotFound desc = could not find container \"eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6\": container with ID starting with eb0126fbc50014753db351d21c959554b1104c3be979fd042e34abe0dea9f6e6 not found: ID does not exist" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.756997 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.757044 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-log-httpd\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.757086 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-scripts\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.757139 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzqtd\" (UniqueName: \"kubernetes.io/projected/529dcab1-017d-47a0-a2e9-2110c3040135-kube-api-access-qzqtd\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.757163 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-config-data\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.757187 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.757231 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-run-httpd\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.757755 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-run-httpd\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.758520 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-log-httpd\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.762201 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-config-data\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc 
kubenswrapper[4830]: I1203 22:28:38.762226 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-scripts\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.762908 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.763489 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.777787 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzqtd\" (UniqueName: \"kubernetes.io/projected/529dcab1-017d-47a0-a2e9-2110c3040135-kube-api-access-qzqtd\") pod \"ceilometer-0\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " pod="openstack/ceilometer-0" Dec 03 22:28:38 crc kubenswrapper[4830]: I1203 22:28:38.977065 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:39 crc kubenswrapper[4830]: I1203 22:28:39.348568 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c149b6-9f60-45fa-8289-34c1ceb3a7f2" path="/var/lib/kubelet/pods/30c149b6-9f60-45fa-8289-34c1ceb3a7f2/volumes" Dec 03 22:28:39 crc kubenswrapper[4830]: I1203 22:28:39.456881 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:39 crc kubenswrapper[4830]: I1203 22:28:39.552694 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"529dcab1-017d-47a0-a2e9-2110c3040135","Type":"ContainerStarted","Data":"d36e33978441f900ee13c40cf54fbe554dc36d03299201fcae2b88658546cc02"} Dec 03 22:28:39 crc kubenswrapper[4830]: I1203 22:28:39.555483 4830 generic.go:334] "Generic (PLEG): container finished" podID="026064cc-f701-4574-baf6-261e157061a8" containerID="ce7de5cea40f94b237ce0a817171e7fc448448770ce0e17bd595172e4545c28c" exitCode=0 Dec 03 22:28:39 crc kubenswrapper[4830]: I1203 22:28:39.555577 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nb55g" event={"ID":"026064cc-f701-4574-baf6-261e157061a8","Type":"ContainerDied","Data":"ce7de5cea40f94b237ce0a817171e7fc448448770ce0e17bd595172e4545c28c"} Dec 03 22:28:40 crc kubenswrapper[4830]: I1203 22:28:40.253946 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:40 crc kubenswrapper[4830]: I1203 22:28:40.565495 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"529dcab1-017d-47a0-a2e9-2110c3040135","Type":"ContainerStarted","Data":"75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0"} Dec 03 22:28:40 crc kubenswrapper[4830]: I1203 22:28:40.943305 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.017847 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-scripts\") pod \"026064cc-f701-4574-baf6-261e157061a8\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.017936 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdlx4\" (UniqueName: \"kubernetes.io/projected/026064cc-f701-4574-baf6-261e157061a8-kube-api-access-rdlx4\") pod \"026064cc-f701-4574-baf6-261e157061a8\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.018007 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-combined-ca-bundle\") pod \"026064cc-f701-4574-baf6-261e157061a8\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.018099 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-config-data\") pod \"026064cc-f701-4574-baf6-261e157061a8\" (UID: \"026064cc-f701-4574-baf6-261e157061a8\") " Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.023880 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026064cc-f701-4574-baf6-261e157061a8-kube-api-access-rdlx4" (OuterVolumeSpecName: "kube-api-access-rdlx4") pod "026064cc-f701-4574-baf6-261e157061a8" (UID: "026064cc-f701-4574-baf6-261e157061a8"). InnerVolumeSpecName "kube-api-access-rdlx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.043625 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-scripts" (OuterVolumeSpecName: "scripts") pod "026064cc-f701-4574-baf6-261e157061a8" (UID: "026064cc-f701-4574-baf6-261e157061a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.060800 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "026064cc-f701-4574-baf6-261e157061a8" (UID: "026064cc-f701-4574-baf6-261e157061a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.063328 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-config-data" (OuterVolumeSpecName: "config-data") pod "026064cc-f701-4574-baf6-261e157061a8" (UID: "026064cc-f701-4574-baf6-261e157061a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.120460 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.120595 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.120658 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdlx4\" (UniqueName: \"kubernetes.io/projected/026064cc-f701-4574-baf6-261e157061a8-kube-api-access-rdlx4\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.120683 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026064cc-f701-4574-baf6-261e157061a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.583173 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nb55g" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.583240 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nb55g" event={"ID":"026064cc-f701-4574-baf6-261e157061a8","Type":"ContainerDied","Data":"0f098f7a2248ec2063ab9eee547632c1ce8d0c90dc7a012e62d07ad641454a6e"} Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.584387 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f098f7a2248ec2063ab9eee547632c1ce8d0c90dc7a012e62d07ad641454a6e" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.586343 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"529dcab1-017d-47a0-a2e9-2110c3040135","Type":"ContainerStarted","Data":"b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c"} Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.705453 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 22:28:41 crc kubenswrapper[4830]: E1203 22:28:41.705851 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="026064cc-f701-4574-baf6-261e157061a8" containerName="nova-cell0-conductor-db-sync" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.705868 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="026064cc-f701-4574-baf6-261e157061a8" containerName="nova-cell0-conductor-db-sync" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.706106 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="026064cc-f701-4574-baf6-261e157061a8" containerName="nova-cell0-conductor-db-sync" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.707038 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.710213 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hw9lf" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.710239 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.721329 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.738209 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l22jt\" (UniqueName: \"kubernetes.io/projected/4085042f-6768-44ca-be35-b0a9c78655fa-kube-api-access-l22jt\") pod \"nova-cell0-conductor-0\" (UID: \"4085042f-6768-44ca-be35-b0a9c78655fa\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.738413 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4085042f-6768-44ca-be35-b0a9c78655fa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4085042f-6768-44ca-be35-b0a9c78655fa\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.738693 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4085042f-6768-44ca-be35-b0a9c78655fa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4085042f-6768-44ca-be35-b0a9c78655fa\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.840816 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4085042f-6768-44ca-be35-b0a9c78655fa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4085042f-6768-44ca-be35-b0a9c78655fa\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.840910 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l22jt\" (UniqueName: \"kubernetes.io/projected/4085042f-6768-44ca-be35-b0a9c78655fa-kube-api-access-l22jt\") pod \"nova-cell0-conductor-0\" (UID: \"4085042f-6768-44ca-be35-b0a9c78655fa\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.840927 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4085042f-6768-44ca-be35-b0a9c78655fa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4085042f-6768-44ca-be35-b0a9c78655fa\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.845596 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4085042f-6768-44ca-be35-b0a9c78655fa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4085042f-6768-44ca-be35-b0a9c78655fa\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.845786 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4085042f-6768-44ca-be35-b0a9c78655fa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4085042f-6768-44ca-be35-b0a9c78655fa\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:41 crc kubenswrapper[4830]: I1203 22:28:41.858533 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l22jt\" (UniqueName: \"kubernetes.io/projected/4085042f-6768-44ca-be35-b0a9c78655fa-kube-api-access-l22jt\") pod \"nova-cell0-conductor-0\" (UID: 
\"4085042f-6768-44ca-be35-b0a9c78655fa\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:42 crc kubenswrapper[4830]: I1203 22:28:42.030027 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:42 crc kubenswrapper[4830]: I1203 22:28:42.507970 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 22:28:42 crc kubenswrapper[4830]: W1203 22:28:42.512862 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4085042f_6768_44ca_be35_b0a9c78655fa.slice/crio-4e32ca1ad3e3b4a0f95fc66710b8d671a8cac018df7de97399e808c0973acc49 WatchSource:0}: Error finding container 4e32ca1ad3e3b4a0f95fc66710b8d671a8cac018df7de97399e808c0973acc49: Status 404 returned error can't find the container with id 4e32ca1ad3e3b4a0f95fc66710b8d671a8cac018df7de97399e808c0973acc49 Dec 03 22:28:42 crc kubenswrapper[4830]: I1203 22:28:42.602317 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"529dcab1-017d-47a0-a2e9-2110c3040135","Type":"ContainerStarted","Data":"ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07"} Dec 03 22:28:42 crc kubenswrapper[4830]: I1203 22:28:42.603626 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4085042f-6768-44ca-be35-b0a9c78655fa","Type":"ContainerStarted","Data":"4e32ca1ad3e3b4a0f95fc66710b8d671a8cac018df7de97399e808c0973acc49"} Dec 03 22:28:43 crc kubenswrapper[4830]: I1203 22:28:43.616881 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"529dcab1-017d-47a0-a2e9-2110c3040135","Type":"ContainerStarted","Data":"d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b"} Dec 03 22:28:43 crc kubenswrapper[4830]: I1203 22:28:43.617521 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Dec 03 22:28:43 crc kubenswrapper[4830]: I1203 22:28:43.617128 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="sg-core" containerID="cri-o://ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07" gracePeriod=30 Dec 03 22:28:43 crc kubenswrapper[4830]: I1203 22:28:43.617062 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="ceilometer-central-agent" containerID="cri-o://75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0" gracePeriod=30 Dec 03 22:28:43 crc kubenswrapper[4830]: I1203 22:28:43.617178 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="proxy-httpd" containerID="cri-o://d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b" gracePeriod=30 Dec 03 22:28:43 crc kubenswrapper[4830]: I1203 22:28:43.617196 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="ceilometer-notification-agent" containerID="cri-o://b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c" gracePeriod=30 Dec 03 22:28:43 crc kubenswrapper[4830]: I1203 22:28:43.621215 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4085042f-6768-44ca-be35-b0a9c78655fa","Type":"ContainerStarted","Data":"a286f9bcb616f04ee4be6b92bc2e527c01ed6b835d7fbf59bf18eae62f9ea7af"} Dec 03 22:28:43 crc kubenswrapper[4830]: I1203 22:28:43.622173 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:43 crc kubenswrapper[4830]: I1203 22:28:43.662615 4830 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.224422492 podStartE2EDuration="5.662593917s" podCreationTimestamp="2025-12-03 22:28:38 +0000 UTC" firstStartedPulling="2025-12-03 22:28:39.455086129 +0000 UTC m=+1408.451547478" lastFinishedPulling="2025-12-03 22:28:42.893257554 +0000 UTC m=+1411.889718903" observedRunningTime="2025-12-03 22:28:43.649355738 +0000 UTC m=+1412.645817137" watchObservedRunningTime="2025-12-03 22:28:43.662593917 +0000 UTC m=+1412.659055296" Dec 03 22:28:43 crc kubenswrapper[4830]: I1203 22:28:43.685126 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.685105886 podStartE2EDuration="2.685105886s" podCreationTimestamp="2025-12-03 22:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:43.666221695 +0000 UTC m=+1412.662683044" watchObservedRunningTime="2025-12-03 22:28:43.685105886 +0000 UTC m=+1412.681567255" Dec 03 22:28:44 crc kubenswrapper[4830]: I1203 22:28:44.632674 4830 generic.go:334] "Generic (PLEG): container finished" podID="529dcab1-017d-47a0-a2e9-2110c3040135" containerID="d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b" exitCode=0 Dec 03 22:28:44 crc kubenswrapper[4830]: I1203 22:28:44.633058 4830 generic.go:334] "Generic (PLEG): container finished" podID="529dcab1-017d-47a0-a2e9-2110c3040135" containerID="ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07" exitCode=2 Dec 03 22:28:44 crc kubenswrapper[4830]: I1203 22:28:44.633068 4830 generic.go:334] "Generic (PLEG): container finished" podID="529dcab1-017d-47a0-a2e9-2110c3040135" containerID="b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c" exitCode=0 Dec 03 22:28:44 crc kubenswrapper[4830]: I1203 22:28:44.632753 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"529dcab1-017d-47a0-a2e9-2110c3040135","Type":"ContainerDied","Data":"d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b"} Dec 03 22:28:44 crc kubenswrapper[4830]: I1203 22:28:44.633188 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"529dcab1-017d-47a0-a2e9-2110c3040135","Type":"ContainerDied","Data":"ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07"} Dec 03 22:28:44 crc kubenswrapper[4830]: I1203 22:28:44.633203 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"529dcab1-017d-47a0-a2e9-2110c3040135","Type":"ContainerDied","Data":"b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c"} Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.631237 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.671421 4830 generic.go:334] "Generic (PLEG): container finished" podID="529dcab1-017d-47a0-a2e9-2110c3040135" containerID="75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0" exitCode=0 Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.671500 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"529dcab1-017d-47a0-a2e9-2110c3040135","Type":"ContainerDied","Data":"75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0"} Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.671570 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"529dcab1-017d-47a0-a2e9-2110c3040135","Type":"ContainerDied","Data":"d36e33978441f900ee13c40cf54fbe554dc36d03299201fcae2b88658546cc02"} Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.671596 4830 scope.go:117] "RemoveContainer" containerID="d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b" Dec 03 22:28:46 crc 
kubenswrapper[4830]: I1203 22:28:46.671857 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.708004 4830 scope.go:117] "RemoveContainer" containerID="ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.728133 4830 scope.go:117] "RemoveContainer" containerID="b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.747112 4830 scope.go:117] "RemoveContainer" containerID="75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.765584 4830 scope.go:117] "RemoveContainer" containerID="d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b" Dec 03 22:28:46 crc kubenswrapper[4830]: E1203 22:28:46.766013 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b\": container with ID starting with d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b not found: ID does not exist" containerID="d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.766054 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b"} err="failed to get container status \"d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b\": rpc error: code = NotFound desc = could not find container \"d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b\": container with ID starting with d01ddb5e8d3f028177bc0b77a346d7dcfdad55823eec8bc2bbc01be1f294939b not found: ID does not exist" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.766084 4830 
scope.go:117] "RemoveContainer" containerID="ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07" Dec 03 22:28:46 crc kubenswrapper[4830]: E1203 22:28:46.766597 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07\": container with ID starting with ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07 not found: ID does not exist" containerID="ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.766633 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07"} err="failed to get container status \"ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07\": rpc error: code = NotFound desc = could not find container \"ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07\": container with ID starting with ab46cc152eba94fb8370fdcfb7fdeed20514c6ec45a8f4f9d1258bc45d34aa07 not found: ID does not exist" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.766661 4830 scope.go:117] "RemoveContainer" containerID="b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c" Dec 03 22:28:46 crc kubenswrapper[4830]: E1203 22:28:46.766974 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c\": container with ID starting with b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c not found: ID does not exist" containerID="b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.767007 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c"} err="failed to get container status \"b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c\": rpc error: code = NotFound desc = could not find container \"b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c\": container with ID starting with b1bace39d356106b7053dcc2c89a8a77101703b1cb443bb2386e16a1072c831c not found: ID does not exist" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.767028 4830 scope.go:117] "RemoveContainer" containerID="75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0" Dec 03 22:28:46 crc kubenswrapper[4830]: E1203 22:28:46.767246 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0\": container with ID starting with 75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0 not found: ID does not exist" containerID="75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.767278 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0"} err="failed to get container status \"75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0\": rpc error: code = NotFound desc = could not find container \"75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0\": container with ID starting with 75c46556f8432aea4cbf281e2cf53692d7bb85646aa09ded86d8f5b9a4be46a0 not found: ID does not exist" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.835837 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-log-httpd\") pod \"529dcab1-017d-47a0-a2e9-2110c3040135\" (UID: 
\"529dcab1-017d-47a0-a2e9-2110c3040135\") " Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.836029 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-config-data\") pod \"529dcab1-017d-47a0-a2e9-2110c3040135\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.836066 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-scripts\") pod \"529dcab1-017d-47a0-a2e9-2110c3040135\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.836128 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-combined-ca-bundle\") pod \"529dcab1-017d-47a0-a2e9-2110c3040135\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.836190 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-sg-core-conf-yaml\") pod \"529dcab1-017d-47a0-a2e9-2110c3040135\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.836244 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzqtd\" (UniqueName: \"kubernetes.io/projected/529dcab1-017d-47a0-a2e9-2110c3040135-kube-api-access-qzqtd\") pod \"529dcab1-017d-47a0-a2e9-2110c3040135\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.836308 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-run-httpd\") pod \"529dcab1-017d-47a0-a2e9-2110c3040135\" (UID: \"529dcab1-017d-47a0-a2e9-2110c3040135\") " Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.836836 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "529dcab1-017d-47a0-a2e9-2110c3040135" (UID: "529dcab1-017d-47a0-a2e9-2110c3040135"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.837148 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.837226 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "529dcab1-017d-47a0-a2e9-2110c3040135" (UID: "529dcab1-017d-47a0-a2e9-2110c3040135"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.841492 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-scripts" (OuterVolumeSpecName: "scripts") pod "529dcab1-017d-47a0-a2e9-2110c3040135" (UID: "529dcab1-017d-47a0-a2e9-2110c3040135"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.843083 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/529dcab1-017d-47a0-a2e9-2110c3040135-kube-api-access-qzqtd" (OuterVolumeSpecName: "kube-api-access-qzqtd") pod "529dcab1-017d-47a0-a2e9-2110c3040135" (UID: "529dcab1-017d-47a0-a2e9-2110c3040135"). InnerVolumeSpecName "kube-api-access-qzqtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.876498 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "529dcab1-017d-47a0-a2e9-2110c3040135" (UID: "529dcab1-017d-47a0-a2e9-2110c3040135"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.938701 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/529dcab1-017d-47a0-a2e9-2110c3040135-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.938738 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.938750 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.938764 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzqtd\" (UniqueName: \"kubernetes.io/projected/529dcab1-017d-47a0-a2e9-2110c3040135-kube-api-access-qzqtd\") on node 
\"crc\" DevicePath \"\"" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.944034 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "529dcab1-017d-47a0-a2e9-2110c3040135" (UID: "529dcab1-017d-47a0-a2e9-2110c3040135"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:46 crc kubenswrapper[4830]: I1203 22:28:46.993697 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-config-data" (OuterVolumeSpecName: "config-data") pod "529dcab1-017d-47a0-a2e9-2110c3040135" (UID: "529dcab1-017d-47a0-a2e9-2110c3040135"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.040194 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.040581 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529dcab1-017d-47a0-a2e9-2110c3040135-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.062994 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.312303 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.323066 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.349236 4830 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" path="/var/lib/kubelet/pods/529dcab1-017d-47a0-a2e9-2110c3040135/volumes" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.350121 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:47 crc kubenswrapper[4830]: E1203 22:28:47.350568 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="ceilometer-notification-agent" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.350590 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="ceilometer-notification-agent" Dec 03 22:28:47 crc kubenswrapper[4830]: E1203 22:28:47.350617 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="ceilometer-central-agent" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.350626 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="ceilometer-central-agent" Dec 03 22:28:47 crc kubenswrapper[4830]: E1203 22:28:47.350667 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="sg-core" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.350676 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="sg-core" Dec 03 22:28:47 crc kubenswrapper[4830]: E1203 22:28:47.350691 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="proxy-httpd" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.350709 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="proxy-httpd" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.350992 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="ceilometer-central-agent" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.351019 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="proxy-httpd" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.351038 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="ceilometer-notification-agent" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.351069 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="529dcab1-017d-47a0-a2e9-2110c3040135" containerName="sg-core" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.357310 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.362012 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.368420 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.374453 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.492530 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mlqw8"] Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.494231 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.505162 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.505770 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mlqw8"] Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.508379 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.552941 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6s4m\" (UniqueName: \"kubernetes.io/projected/c69e374c-c6f7-4d9e-a302-45844b3ca243-kube-api-access-s6s4m\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.553014 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.553186 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-scripts\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.553404 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.553545 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-run-httpd\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.553584 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-log-httpd\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.553618 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-config-data\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.655069 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.655145 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-scripts\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.655203 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.655234 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.655255 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhw8\" (UniqueName: \"kubernetes.io/projected/79ce7502-4906-4ec8-941a-04aa6486cf93-kube-api-access-dwhw8\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.655289 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-run-httpd\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.655307 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-log-httpd\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.655322 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-scripts\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.655356 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-config-data\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.655441 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6s4m\" (UniqueName: \"kubernetes.io/projected/c69e374c-c6f7-4d9e-a302-45844b3ca243-kube-api-access-s6s4m\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.655470 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-config-data\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.656238 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-run-httpd\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.657636 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-log-httpd\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " 
pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.685193 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-scripts\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.686145 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.689386 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-config-data\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.694868 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.704287 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6s4m\" (UniqueName: \"kubernetes.io/projected/c69e374c-c6f7-4d9e-a302-45844b3ca243-kube-api-access-s6s4m\") pod \"ceilometer-0\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " pod="openstack/ceilometer-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.741538 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.744092 4830 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.762351 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.763875 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.764022 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhw8\" (UniqueName: \"kubernetes.io/projected/79ce7502-4906-4ec8-941a-04aa6486cf93-kube-api-access-dwhw8\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.764119 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-scripts\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.764261 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-config-data\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.781286 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.782094 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-scripts\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.782344 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-config-data\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.783562 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.802765 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.804170 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.804646 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhw8\" (UniqueName: \"kubernetes.io/projected/79ce7502-4906-4ec8-941a-04aa6486cf93-kube-api-access-dwhw8\") pod \"nova-cell0-cell-mapping-mlqw8\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.808574 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.813237 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.826875 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.871836 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngjzp\" (UniqueName: \"kubernetes.io/projected/824ae862-2f34-4c76-b9b4-c275a058c466-kube-api-access-ngjzp\") pod \"nova-api-0\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.871896 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824ae862-2f34-4c76-b9b4-c275a058c466-logs\") pod \"nova-api-0\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.871981 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-config-data\") pod \"nova-api-0\" (UID: 
\"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.872040 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.916677 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.918424 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.920476 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.952117 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.953817 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.958912 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.975403 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-config-data\") pod \"nova-api-0\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.975698 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.975826 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-config-data\") pod \"nova-scheduler-0\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.975910 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mftgd\" (UniqueName: \"kubernetes.io/projected/865db8fd-ea83-4b7a-a24a-546284b3f835-kube-api-access-mftgd\") pod \"nova-scheduler-0\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.975994 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.976136 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngjzp\" (UniqueName: \"kubernetes.io/projected/824ae862-2f34-4c76-b9b4-c275a058c466-kube-api-access-ngjzp\") pod \"nova-api-0\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.976230 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824ae862-2f34-4c76-b9b4-c275a058c466-logs\") pod \"nova-api-0\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.976685 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824ae862-2f34-4c76-b9b4-c275a058c466-logs\") pod \"nova-api-0\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.981611 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-config-data\") pod \"nova-api-0\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.981976 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:47 crc kubenswrapper[4830]: I1203 22:28:47.988643 4830 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.004889 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngjzp\" (UniqueName: \"kubernetes.io/projected/824ae862-2f34-4c76-b9b4-c275a058c466-kube-api-access-ngjzp\") pod \"nova-api-0\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " pod="openstack/nova-api-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.017080 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.052194 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.068881 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-6bnv5"] Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.071763 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.098746 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d336e40-a59d-493e-be36-fffc2a5a6bd3-logs\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.098785 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-config-data\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.098933 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.099024 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpsrv\" (UniqueName: \"kubernetes.io/projected/9d336e40-a59d-493e-be36-fffc2a5a6bd3-kube-api-access-zpsrv\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.100339 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 
22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.100451 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.100978 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-config-data\") pod \"nova-scheduler-0\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.101043 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mftgd\" (UniqueName: \"kubernetes.io/projected/865db8fd-ea83-4b7a-a24a-546284b3f835-kube-api-access-mftgd\") pod \"nova-scheduler-0\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.101127 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krtnf\" (UniqueName: \"kubernetes.io/projected/203323f1-89ed-49c0-b7c0-31273a77c22c-kube-api-access-krtnf\") pod \"nova-cell1-novncproxy-0\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.101793 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.111016 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.113731 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-config-data\") pod \"nova-scheduler-0\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.124593 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-6bnv5"] Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.125568 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mftgd\" (UniqueName: \"kubernetes.io/projected/865db8fd-ea83-4b7a-a24a-546284b3f835-kube-api-access-mftgd\") pod \"nova-scheduler-0\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.197498 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204360 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204437 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-config\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204483 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-svc\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204535 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krtnf\" (UniqueName: \"kubernetes.io/projected/203323f1-89ed-49c0-b7c0-31273a77c22c-kube-api-access-krtnf\") pod \"nova-cell1-novncproxy-0\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204597 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204644 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d336e40-a59d-493e-be36-fffc2a5a6bd3-logs\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204667 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-config-data\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204728 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrq2g\" (UniqueName: \"kubernetes.io/projected/58d17afd-4fea-430f-951a-98d40b505b9d-kube-api-access-xrq2g\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204754 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204798 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpsrv\" (UniqueName: \"kubernetes.io/projected/9d336e40-a59d-493e-be36-fffc2a5a6bd3-kube-api-access-zpsrv\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204838 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204919 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.204950 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.206043 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d336e40-a59d-493e-be36-fffc2a5a6bd3-logs\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.208890 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-config-data\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.211465 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.212197 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.215101 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.227117 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.228778 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpsrv\" (UniqueName: \"kubernetes.io/projected/9d336e40-a59d-493e-be36-fffc2a5a6bd3-kube-api-access-zpsrv\") pod \"nova-metadata-0\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.229586 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krtnf\" (UniqueName: \"kubernetes.io/projected/203323f1-89ed-49c0-b7c0-31273a77c22c-kube-api-access-krtnf\") pod \"nova-cell1-novncproxy-0\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.262976 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.281962 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.312718 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrq2g\" (UniqueName: \"kubernetes.io/projected/58d17afd-4fea-430f-951a-98d40b505b9d-kube-api-access-xrq2g\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.312870 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.312892 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.312916 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.312961 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-config\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.313005 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-svc\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.314641 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-svc\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.314834 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.315235 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.329870 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-config\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: 
\"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.330166 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.335885 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrq2g\" (UniqueName: \"kubernetes.io/projected/58d17afd-4fea-430f-951a-98d40b505b9d-kube-api-access-xrq2g\") pod \"dnsmasq-dns-78cd565959-6bnv5\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.406309 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.521730 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mlqw8"] Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.714817 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:48 crc kubenswrapper[4830]: W1203 22:28:48.749307 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc69e374c_c6f7_4d9e_a302_45844b3ca243.slice/crio-1958a1e866d181af27c17ce431fcccc352ae7404988a7e18d39e7b2c25bff573 WatchSource:0}: Error finding container 1958a1e866d181af27c17ce431fcccc352ae7404988a7e18d39e7b2c25bff573: Status 404 returned error can't find the container with id 1958a1e866d181af27c17ce431fcccc352ae7404988a7e18d39e7b2c25bff573 Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.751144 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mlqw8" event={"ID":"79ce7502-4906-4ec8-941a-04aa6486cf93","Type":"ContainerStarted","Data":"b1c67ba2cc404d3d4405bcc34b46ac355b4f4df64b667139cfd1dbcdfc2a456c"} Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.805117 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-v8fhw"] Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.806520 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.810953 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.811130 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.826375 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-v8fhw"] Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.925557 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-scripts\") pod \"nova-cell1-conductor-db-sync-v8fhw\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.925653 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxpzd\" (UniqueName: \"kubernetes.io/projected/2a58c35e-a16d-4014-a958-67f2a5461287-kube-api-access-fxpzd\") pod \"nova-cell1-conductor-db-sync-v8fhw\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 
22:28:48.925720 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-v8fhw\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:48 crc kubenswrapper[4830]: I1203 22:28:48.925736 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-config-data\") pod \"nova-cell1-conductor-db-sync-v8fhw\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.000539 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.027912 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-scripts\") pod \"nova-cell1-conductor-db-sync-v8fhw\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.028023 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxpzd\" (UniqueName: \"kubernetes.io/projected/2a58c35e-a16d-4014-a958-67f2a5461287-kube-api-access-fxpzd\") pod \"nova-cell1-conductor-db-sync-v8fhw\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.028099 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-v8fhw\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.028118 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-config-data\") pod \"nova-cell1-conductor-db-sync-v8fhw\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.037657 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-v8fhw\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.042994 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-scripts\") pod \"nova-cell1-conductor-db-sync-v8fhw\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.049033 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-config-data\") pod \"nova-cell1-conductor-db-sync-v8fhw\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.055872 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxpzd\" (UniqueName: \"kubernetes.io/projected/2a58c35e-a16d-4014-a958-67f2a5461287-kube-api-access-fxpzd\") pod \"nova-cell1-conductor-db-sync-v8fhw\" 
(UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.132303 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.382553 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.466972 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.476319 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.502705 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-6bnv5"] Dec 03 22:28:49 crc kubenswrapper[4830]: W1203 22:28:49.504805 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58d17afd_4fea_430f_951a_98d40b505b9d.slice/crio-8a68372349b4b2fa617d38b982527426fc3da47782a914cb8eec5a81d70517fb WatchSource:0}: Error finding container 8a68372349b4b2fa617d38b982527426fc3da47782a914cb8eec5a81d70517fb: Status 404 returned error can't find the container with id 8a68372349b4b2fa617d38b982527426fc3da47782a914cb8eec5a81d70517fb Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.787886 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69e374c-c6f7-4d9e-a302-45844b3ca243","Type":"ContainerStarted","Data":"9cefab27e4cf697e9f9cf881316dbe2a23f4c6786296debd89dda073575986c8"} Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.788221 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c69e374c-c6f7-4d9e-a302-45844b3ca243","Type":"ContainerStarted","Data":"1958a1e866d181af27c17ce431fcccc352ae7404988a7e18d39e7b2c25bff573"} Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.790919 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mlqw8" event={"ID":"79ce7502-4906-4ec8-941a-04aa6486cf93","Type":"ContainerStarted","Data":"d1d5ec0b211e31aa4976e4d8ac3cf611cfe8a423df0c18f32ddc6b1e759d6e3f"} Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.794139 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-v8fhw"] Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.795075 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"203323f1-89ed-49c0-b7c0-31273a77c22c","Type":"ContainerStarted","Data":"469972217b30fa43c2e4ce71d9558d959ab0023212990c1d848224b6fc364648"} Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.797598 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" event={"ID":"58d17afd-4fea-430f-951a-98d40b505b9d","Type":"ContainerStarted","Data":"8142bbc1a9e49358537f6fbeec417c4c11dd37da86bf46815b7c43bdcdcc51f0"} Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.797626 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" event={"ID":"58d17afd-4fea-430f-951a-98d40b505b9d","Type":"ContainerStarted","Data":"8a68372349b4b2fa617d38b982527426fc3da47782a914cb8eec5a81d70517fb"} Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.799606 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"824ae862-2f34-4c76-b9b4-c275a058c466","Type":"ContainerStarted","Data":"353d87ff666f3a8f2416e3af3132b3af260b67c52bcfcc7ff523dfd598a1a035"} Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.808410 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"865db8fd-ea83-4b7a-a24a-546284b3f835","Type":"ContainerStarted","Data":"8de33e8876e793f4a83c3c619a7bc3f6e5a9f2bb0ae396da4cbb1f6ef0d59d03"} Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.813851 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d336e40-a59d-493e-be36-fffc2a5a6bd3","Type":"ContainerStarted","Data":"f7a64ac28caf4b6309317eb0d07c03fe8ddebafd5bdea3256608d220a058520e"} Dec 03 22:28:49 crc kubenswrapper[4830]: I1203 22:28:49.834609 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mlqw8" podStartSLOduration=2.834587713 podStartE2EDuration="2.834587713s" podCreationTimestamp="2025-12-03 22:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:49.810113761 +0000 UTC m=+1418.806575110" watchObservedRunningTime="2025-12-03 22:28:49.834587713 +0000 UTC m=+1418.831049062" Dec 03 22:28:49 crc kubenswrapper[4830]: W1203 22:28:49.852207 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a58c35e_a16d_4014_a958_67f2a5461287.slice/crio-70c1907b6d2a3aafd0d7da26c7d5996ed12dc59c229ceb3834268c267707e05b WatchSource:0}: Error finding container 70c1907b6d2a3aafd0d7da26c7d5996ed12dc59c229ceb3834268c267707e05b: Status 404 returned error can't find the container with id 70c1907b6d2a3aafd0d7da26c7d5996ed12dc59c229ceb3834268c267707e05b Dec 03 22:28:50 crc kubenswrapper[4830]: I1203 22:28:50.831516 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-v8fhw" event={"ID":"2a58c35e-a16d-4014-a958-67f2a5461287","Type":"ContainerStarted","Data":"79fa30ed86e17dd6eea827b5b8b9a7f4f692e4153bc0ee24a942805f9cd53759"} Dec 03 22:28:50 crc kubenswrapper[4830]: I1203 22:28:50.832047 4830 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-v8fhw" event={"ID":"2a58c35e-a16d-4014-a958-67f2a5461287","Type":"ContainerStarted","Data":"70c1907b6d2a3aafd0d7da26c7d5996ed12dc59c229ceb3834268c267707e05b"} Dec 03 22:28:50 crc kubenswrapper[4830]: I1203 22:28:50.839085 4830 generic.go:334] "Generic (PLEG): container finished" podID="58d17afd-4fea-430f-951a-98d40b505b9d" containerID="8142bbc1a9e49358537f6fbeec417c4c11dd37da86bf46815b7c43bdcdcc51f0" exitCode=0 Dec 03 22:28:50 crc kubenswrapper[4830]: I1203 22:28:50.839346 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" event={"ID":"58d17afd-4fea-430f-951a-98d40b505b9d","Type":"ContainerDied","Data":"8142bbc1a9e49358537f6fbeec417c4c11dd37da86bf46815b7c43bdcdcc51f0"} Dec 03 22:28:50 crc kubenswrapper[4830]: I1203 22:28:50.842106 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69e374c-c6f7-4d9e-a302-45844b3ca243","Type":"ContainerStarted","Data":"00f5771a3a9e81adda147d72dcc505c7c574eac214f135250c7782d1270d0996"} Dec 03 22:28:50 crc kubenswrapper[4830]: I1203 22:28:50.881295 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-v8fhw" podStartSLOduration=2.881274173 podStartE2EDuration="2.881274173s" podCreationTimestamp="2025-12-03 22:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:50.852161144 +0000 UTC m=+1419.848622493" watchObservedRunningTime="2025-12-03 22:28:50.881274173 +0000 UTC m=+1419.877735572" Dec 03 22:28:51 crc kubenswrapper[4830]: I1203 22:28:51.404460 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:28:51 crc kubenswrapper[4830]: I1203 22:28:51.417356 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.906032 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" event={"ID":"58d17afd-4fea-430f-951a-98d40b505b9d","Type":"ContainerStarted","Data":"cc5bf7687dfb8cc365f2b312194cd250fa43e7f499bce34a1b981d42d8031039"} Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.906674 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.907836 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"824ae862-2f34-4c76-b9b4-c275a058c466","Type":"ContainerStarted","Data":"193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005"} Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.907884 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"824ae862-2f34-4c76-b9b4-c275a058c466","Type":"ContainerStarted","Data":"9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98"} Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.909823 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"865db8fd-ea83-4b7a-a24a-546284b3f835","Type":"ContainerStarted","Data":"b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda"} Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.911640 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d336e40-a59d-493e-be36-fffc2a5a6bd3","Type":"ContainerStarted","Data":"eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280"} Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.911673 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"9d336e40-a59d-493e-be36-fffc2a5a6bd3","Type":"ContainerStarted","Data":"5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8"} Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.911777 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d336e40-a59d-493e-be36-fffc2a5a6bd3" containerName="nova-metadata-log" containerID="cri-o://5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8" gracePeriod=30 Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.911825 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d336e40-a59d-493e-be36-fffc2a5a6bd3" containerName="nova-metadata-metadata" containerID="cri-o://eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280" gracePeriod=30 Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.920215 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69e374c-c6f7-4d9e-a302-45844b3ca243","Type":"ContainerStarted","Data":"5012856d8b265195bd2e96366a5da4fb89584064fe6e0130f7b7184ec84411ca"} Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.924030 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"203323f1-89ed-49c0-b7c0-31273a77c22c","Type":"ContainerStarted","Data":"acede5cebe2ec492a7f5581f438c944410198aec1103c227865f8442b48df2eb"} Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.924210 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="203323f1-89ed-49c0-b7c0-31273a77c22c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://acede5cebe2ec492a7f5581f438c944410198aec1103c227865f8442b48df2eb" gracePeriod=30 Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.930746 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-78cd565959-6bnv5" podStartSLOduration=7.930725102 podStartE2EDuration="7.930725102s" podCreationTimestamp="2025-12-03 22:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:54.928170122 +0000 UTC m=+1423.924631471" watchObservedRunningTime="2025-12-03 22:28:54.930725102 +0000 UTC m=+1423.927186451" Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.952129 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.361797903 podStartE2EDuration="7.952113151s" podCreationTimestamp="2025-12-03 22:28:47 +0000 UTC" firstStartedPulling="2025-12-03 22:28:49.006788649 +0000 UTC m=+1418.003249998" lastFinishedPulling="2025-12-03 22:28:53.597103857 +0000 UTC m=+1422.593565246" observedRunningTime="2025-12-03 22:28:54.945011838 +0000 UTC m=+1423.941473187" watchObservedRunningTime="2025-12-03 22:28:54.952113151 +0000 UTC m=+1423.948574500" Dec 03 22:28:54 crc kubenswrapper[4830]: I1203 22:28:54.990494 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.903131533 podStartE2EDuration="7.990474859s" podCreationTimestamp="2025-12-03 22:28:47 +0000 UTC" firstStartedPulling="2025-12-03 22:28:49.460476898 +0000 UTC m=+1418.456938247" lastFinishedPulling="2025-12-03 22:28:53.547820224 +0000 UTC m=+1422.544281573" observedRunningTime="2025-12-03 22:28:54.976767788 +0000 UTC m=+1423.973229137" watchObservedRunningTime="2025-12-03 22:28:54.990474859 +0000 UTC m=+1423.986936208" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.004326 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.922262313 podStartE2EDuration="8.004306984s" podCreationTimestamp="2025-12-03 22:28:47 +0000 UTC" firstStartedPulling="2025-12-03 
22:28:49.464831466 +0000 UTC m=+1418.461292815" lastFinishedPulling="2025-12-03 22:28:53.546876127 +0000 UTC m=+1422.543337486" observedRunningTime="2025-12-03 22:28:54.991821125 +0000 UTC m=+1423.988282474" watchObservedRunningTime="2025-12-03 22:28:55.004306984 +0000 UTC m=+1424.000768333" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.036049 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.875113985 podStartE2EDuration="8.036026212s" podCreationTimestamp="2025-12-03 22:28:47 +0000 UTC" firstStartedPulling="2025-12-03 22:28:49.386635039 +0000 UTC m=+1418.383096378" lastFinishedPulling="2025-12-03 22:28:53.547547256 +0000 UTC m=+1422.544008605" observedRunningTime="2025-12-03 22:28:55.014089568 +0000 UTC m=+1424.010550917" watchObservedRunningTime="2025-12-03 22:28:55.036026212 +0000 UTC m=+1424.032487561" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.467668 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.574898 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-combined-ca-bundle\") pod \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.574957 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpsrv\" (UniqueName: \"kubernetes.io/projected/9d336e40-a59d-493e-be36-fffc2a5a6bd3-kube-api-access-zpsrv\") pod \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.575017 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-config-data\") pod \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.575751 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d336e40-a59d-493e-be36-fffc2a5a6bd3-logs\") pod \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\" (UID: \"9d336e40-a59d-493e-be36-fffc2a5a6bd3\") " Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.576617 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d336e40-a59d-493e-be36-fffc2a5a6bd3-logs" (OuterVolumeSpecName: "logs") pod "9d336e40-a59d-493e-be36-fffc2a5a6bd3" (UID: "9d336e40-a59d-493e-be36-fffc2a5a6bd3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.580321 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d336e40-a59d-493e-be36-fffc2a5a6bd3-kube-api-access-zpsrv" (OuterVolumeSpecName: "kube-api-access-zpsrv") pod "9d336e40-a59d-493e-be36-fffc2a5a6bd3" (UID: "9d336e40-a59d-493e-be36-fffc2a5a6bd3"). InnerVolumeSpecName "kube-api-access-zpsrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.605597 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-config-data" (OuterVolumeSpecName: "config-data") pod "9d336e40-a59d-493e-be36-fffc2a5a6bd3" (UID: "9d336e40-a59d-493e-be36-fffc2a5a6bd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.620558 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d336e40-a59d-493e-be36-fffc2a5a6bd3" (UID: "9d336e40-a59d-493e-be36-fffc2a5a6bd3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.678520 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d336e40-a59d-493e-be36-fffc2a5a6bd3-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.678555 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.678567 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpsrv\" (UniqueName: \"kubernetes.io/projected/9d336e40-a59d-493e-be36-fffc2a5a6bd3-kube-api-access-zpsrv\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.678575 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d336e40-a59d-493e-be36-fffc2a5a6bd3-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.938156 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69e374c-c6f7-4d9e-a302-45844b3ca243","Type":"ContainerStarted","Data":"fa2a3c97db2b7dd6dc82b68e486ae304d82402699b1d50f726e2937da06eb202"} Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.938820 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.941453 4830 generic.go:334] "Generic (PLEG): container finished" podID="9d336e40-a59d-493e-be36-fffc2a5a6bd3" containerID="eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280" exitCode=0 Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.941480 4830 generic.go:334] "Generic (PLEG): container finished" podID="9d336e40-a59d-493e-be36-fffc2a5a6bd3" 
containerID="5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8" exitCode=143 Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.941539 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d336e40-a59d-493e-be36-fffc2a5a6bd3","Type":"ContainerDied","Data":"eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280"} Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.941592 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d336e40-a59d-493e-be36-fffc2a5a6bd3","Type":"ContainerDied","Data":"5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8"} Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.941608 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d336e40-a59d-493e-be36-fffc2a5a6bd3","Type":"ContainerDied","Data":"f7a64ac28caf4b6309317eb0d07c03fe8ddebafd5bdea3256608d220a058520e"} Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.941629 4830 scope.go:117] "RemoveContainer" containerID="eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.941794 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.972721 4830 scope.go:117] "RemoveContainer" containerID="5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8" Dec 03 22:28:55 crc kubenswrapper[4830]: I1203 22:28:55.975389 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.763960832 podStartE2EDuration="8.975367015s" podCreationTimestamp="2025-12-03 22:28:47 +0000 UTC" firstStartedPulling="2025-12-03 22:28:48.75299173 +0000 UTC m=+1417.749453079" lastFinishedPulling="2025-12-03 22:28:54.964397913 +0000 UTC m=+1423.960859262" observedRunningTime="2025-12-03 22:28:55.962026484 +0000 UTC m=+1424.958487833" watchObservedRunningTime="2025-12-03 22:28:55.975367015 +0000 UTC m=+1424.971828364" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.004711 4830 scope.go:117] "RemoveContainer" containerID="eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280" Dec 03 22:28:56 crc kubenswrapper[4830]: E1203 22:28:56.005183 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280\": container with ID starting with eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280 not found: ID does not exist" containerID="eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.005298 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280"} err="failed to get container status \"eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280\": rpc error: code = NotFound desc = could not find container \"eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280\": container with ID starting with 
eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280 not found: ID does not exist" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.005449 4830 scope.go:117] "RemoveContainer" containerID="5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8" Dec 03 22:28:56 crc kubenswrapper[4830]: E1203 22:28:56.008892 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8\": container with ID starting with 5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8 not found: ID does not exist" containerID="5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.009113 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8"} err="failed to get container status \"5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8\": rpc error: code = NotFound desc = could not find container \"5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8\": container with ID starting with 5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8 not found: ID does not exist" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.009226 4830 scope.go:117] "RemoveContainer" containerID="eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.009880 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280"} err="failed to get container status \"eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280\": rpc error: code = NotFound desc = could not find container \"eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280\": container with ID 
starting with eac553645636663eb0f95ec19c46febd23cdb1da9df795fb4fb8684b3794d280 not found: ID does not exist" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.009940 4830 scope.go:117] "RemoveContainer" containerID="5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.010265 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8"} err="failed to get container status \"5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8\": rpc error: code = NotFound desc = could not find container \"5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8\": container with ID starting with 5c495a4862dc91939d4b9326f341ccee04fc28647bd6cd3ccf4c7a794946c3e8 not found: ID does not exist" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.029964 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.059558 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.112055 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:56 crc kubenswrapper[4830]: E1203 22:28:56.114313 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d336e40-a59d-493e-be36-fffc2a5a6bd3" containerName="nova-metadata-log" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.114333 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d336e40-a59d-493e-be36-fffc2a5a6bd3" containerName="nova-metadata-log" Dec 03 22:28:56 crc kubenswrapper[4830]: E1203 22:28:56.114358 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d336e40-a59d-493e-be36-fffc2a5a6bd3" containerName="nova-metadata-metadata" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 
22:28:56.114366 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d336e40-a59d-493e-be36-fffc2a5a6bd3" containerName="nova-metadata-metadata" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.120757 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d336e40-a59d-493e-be36-fffc2a5a6bd3" containerName="nova-metadata-metadata" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.120835 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d336e40-a59d-493e-be36-fffc2a5a6bd3" containerName="nova-metadata-log" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.127969 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.135542 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.137930 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.173893 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.244308 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.244372 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6552\" (UniqueName: \"kubernetes.io/projected/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-kube-api-access-s6552\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " 
pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.244446 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.244492 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-config-data\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.244627 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-logs\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.346537 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-logs\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.346679 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.346715 4830 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-s6552\" (UniqueName: \"kubernetes.io/projected/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-kube-api-access-s6552\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.346761 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.346790 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-config-data\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.357796 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-logs\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.425527 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-config-data\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.431162 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.434324 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.446025 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6552\" (UniqueName: \"kubernetes.io/projected/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-kube-api-access-s6552\") pod \"nova-metadata-0\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " pod="openstack/nova-metadata-0" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.712528 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.712581 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:28:56 crc kubenswrapper[4830]: I1203 22:28:56.730013 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:28:57 crc kubenswrapper[4830]: W1203 22:28:57.342760 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf61d0377_2105_49a0_aeb0_5dfd2b07fdcc.slice/crio-272b17ba4526241aca3f893710da67ef365311dc791f264eb6e3ebb8506914a1 WatchSource:0}: Error finding container 272b17ba4526241aca3f893710da67ef365311dc791f264eb6e3ebb8506914a1: Status 404 returned error can't find the container with id 272b17ba4526241aca3f893710da67ef365311dc791f264eb6e3ebb8506914a1 Dec 03 22:28:57 crc kubenswrapper[4830]: I1203 22:28:57.354519 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d336e40-a59d-493e-be36-fffc2a5a6bd3" path="/var/lib/kubelet/pods/9d336e40-a59d-493e-be36-fffc2a5a6bd3/volumes" Dec 03 22:28:57 crc kubenswrapper[4830]: I1203 22:28:57.355130 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:57 crc kubenswrapper[4830]: I1203 22:28:57.983464 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc","Type":"ContainerStarted","Data":"4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3"} Dec 03 22:28:57 crc kubenswrapper[4830]: I1203 22:28:57.983842 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc","Type":"ContainerStarted","Data":"f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae"} Dec 03 22:28:57 crc kubenswrapper[4830]: I1203 22:28:57.983857 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc","Type":"ContainerStarted","Data":"272b17ba4526241aca3f893710da67ef365311dc791f264eb6e3ebb8506914a1"} Dec 03 22:28:57 crc kubenswrapper[4830]: I1203 22:28:57.985974 4830 generic.go:334] "Generic 
(PLEG): container finished" podID="79ce7502-4906-4ec8-941a-04aa6486cf93" containerID="d1d5ec0b211e31aa4976e4d8ac3cf611cfe8a423df0c18f32ddc6b1e759d6e3f" exitCode=0 Dec 03 22:28:57 crc kubenswrapper[4830]: I1203 22:28:57.986002 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mlqw8" event={"ID":"79ce7502-4906-4ec8-941a-04aa6486cf93","Type":"ContainerDied","Data":"d1d5ec0b211e31aa4976e4d8ac3cf611cfe8a423df0c18f32ddc6b1e759d6e3f"} Dec 03 22:28:58 crc kubenswrapper[4830]: I1203 22:28:58.012674 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.012657633 podStartE2EDuration="3.012657633s" podCreationTimestamp="2025-12-03 22:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:58.002029536 +0000 UTC m=+1426.998490885" watchObservedRunningTime="2025-12-03 22:28:58.012657633 +0000 UTC m=+1427.009118982" Dec 03 22:28:58 crc kubenswrapper[4830]: I1203 22:28:58.199092 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:28:58 crc kubenswrapper[4830]: I1203 22:28:58.199148 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:28:58 crc kubenswrapper[4830]: I1203 22:28:58.228281 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 22:28:58 crc kubenswrapper[4830]: I1203 22:28:58.228337 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 22:28:58 crc kubenswrapper[4830]: I1203 22:28:58.284037 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:58 crc kubenswrapper[4830]: I1203 22:28:58.289243 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0" Dec 03 22:28:58 crc kubenswrapper[4830]: I1203 22:28:58.995044 4830 generic.go:334] "Generic (PLEG): container finished" podID="2a58c35e-a16d-4014-a958-67f2a5461287" containerID="79fa30ed86e17dd6eea827b5b8b9a7f4f692e4153bc0ee24a942805f9cd53759" exitCode=0 Dec 03 22:28:58 crc kubenswrapper[4830]: I1203 22:28:58.995086 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-v8fhw" event={"ID":"2a58c35e-a16d-4014-a958-67f2a5461287","Type":"ContainerDied","Data":"79fa30ed86e17dd6eea827b5b8b9a7f4f692e4153bc0ee24a942805f9cd53759"} Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.059697 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.281912 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="824ae862-2f34-4c76-b9b4-c275a058c466" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.282291 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="824ae862-2f34-4c76-b9b4-c275a058c466" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.496070 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.669519 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwhw8\" (UniqueName: \"kubernetes.io/projected/79ce7502-4906-4ec8-941a-04aa6486cf93-kube-api-access-dwhw8\") pod \"79ce7502-4906-4ec8-941a-04aa6486cf93\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.669608 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-scripts\") pod \"79ce7502-4906-4ec8-941a-04aa6486cf93\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.669698 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-config-data\") pod \"79ce7502-4906-4ec8-941a-04aa6486cf93\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.669763 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-combined-ca-bundle\") pod \"79ce7502-4906-4ec8-941a-04aa6486cf93\" (UID: \"79ce7502-4906-4ec8-941a-04aa6486cf93\") " Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.695780 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ce7502-4906-4ec8-941a-04aa6486cf93-kube-api-access-dwhw8" (OuterVolumeSpecName: "kube-api-access-dwhw8") pod "79ce7502-4906-4ec8-941a-04aa6486cf93" (UID: "79ce7502-4906-4ec8-941a-04aa6486cf93"). InnerVolumeSpecName "kube-api-access-dwhw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.696209 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-scripts" (OuterVolumeSpecName: "scripts") pod "79ce7502-4906-4ec8-941a-04aa6486cf93" (UID: "79ce7502-4906-4ec8-941a-04aa6486cf93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.711415 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-config-data" (OuterVolumeSpecName: "config-data") pod "79ce7502-4906-4ec8-941a-04aa6486cf93" (UID: "79ce7502-4906-4ec8-941a-04aa6486cf93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.735163 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79ce7502-4906-4ec8-941a-04aa6486cf93" (UID: "79ce7502-4906-4ec8-941a-04aa6486cf93"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.772971 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.773011 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.773029 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwhw8\" (UniqueName: \"kubernetes.io/projected/79ce7502-4906-4ec8-941a-04aa6486cf93-kube-api-access-dwhw8\") on node \"crc\" DevicePath \"\"" Dec 03 22:28:59 crc kubenswrapper[4830]: I1203 22:28:59.773039 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79ce7502-4906-4ec8-941a-04aa6486cf93-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.009111 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mlqw8" event={"ID":"79ce7502-4906-4ec8-941a-04aa6486cf93","Type":"ContainerDied","Data":"b1c67ba2cc404d3d4405bcc34b46ac355b4f4df64b667139cfd1dbcdfc2a456c"} Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.009187 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1c67ba2cc404d3d4405bcc34b46ac355b4f4df64b667139cfd1dbcdfc2a456c" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.009237 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mlqw8" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.214483 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.214715 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="824ae862-2f34-4c76-b9b4-c275a058c466" containerName="nova-api-log" containerID="cri-o://9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98" gracePeriod=30 Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.215139 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="824ae862-2f34-4c76-b9b4-c275a058c466" containerName="nova-api-api" containerID="cri-o://193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005" gracePeriod=30 Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.265241 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.265444 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" containerName="nova-metadata-log" containerID="cri-o://f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae" gracePeriod=30 Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.265534 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" containerName="nova-metadata-metadata" containerID="cri-o://4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3" gracePeriod=30 Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.362234 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.535524 4830 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.593455 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-config-data\") pod \"2a58c35e-a16d-4014-a958-67f2a5461287\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.593528 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-combined-ca-bundle\") pod \"2a58c35e-a16d-4014-a958-67f2a5461287\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.593566 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxpzd\" (UniqueName: \"kubernetes.io/projected/2a58c35e-a16d-4014-a958-67f2a5461287-kube-api-access-fxpzd\") pod \"2a58c35e-a16d-4014-a958-67f2a5461287\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.593798 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-scripts\") pod \"2a58c35e-a16d-4014-a958-67f2a5461287\" (UID: \"2a58c35e-a16d-4014-a958-67f2a5461287\") " Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.602198 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a58c35e-a16d-4014-a958-67f2a5461287-kube-api-access-fxpzd" (OuterVolumeSpecName: "kube-api-access-fxpzd") pod "2a58c35e-a16d-4014-a958-67f2a5461287" (UID: "2a58c35e-a16d-4014-a958-67f2a5461287"). InnerVolumeSpecName "kube-api-access-fxpzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.603788 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-scripts" (OuterVolumeSpecName: "scripts") pod "2a58c35e-a16d-4014-a958-67f2a5461287" (UID: "2a58c35e-a16d-4014-a958-67f2a5461287"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.636617 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-config-data" (OuterVolumeSpecName: "config-data") pod "2a58c35e-a16d-4014-a958-67f2a5461287" (UID: "2a58c35e-a16d-4014-a958-67f2a5461287"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.638768 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a58c35e-a16d-4014-a958-67f2a5461287" (UID: "2a58c35e-a16d-4014-a958-67f2a5461287"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.696437 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.696679 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.696770 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxpzd\" (UniqueName: \"kubernetes.io/projected/2a58c35e-a16d-4014-a958-67f2a5461287-kube-api-access-fxpzd\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.696839 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a58c35e-a16d-4014-a958-67f2a5461287-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:00 crc kubenswrapper[4830]: I1203 22:29:00.904936 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.004372 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6552\" (UniqueName: \"kubernetes.io/projected/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-kube-api-access-s6552\") pod \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.004573 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-nova-metadata-tls-certs\") pod \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.004660 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-logs\") pod \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.004761 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-config-data\") pod \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.004848 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-combined-ca-bundle\") pod \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\" (UID: \"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc\") " Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.005159 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-logs" (OuterVolumeSpecName: "logs") pod "f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" (UID: "f61d0377-2105-49a0-aeb0-5dfd2b07fdcc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.005883 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.010846 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-kube-api-access-s6552" (OuterVolumeSpecName: "kube-api-access-s6552") pod "f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" (UID: "f61d0377-2105-49a0-aeb0-5dfd2b07fdcc"). InnerVolumeSpecName "kube-api-access-s6552". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.025819 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-v8fhw" event={"ID":"2a58c35e-a16d-4014-a958-67f2a5461287","Type":"ContainerDied","Data":"70c1907b6d2a3aafd0d7da26c7d5996ed12dc59c229ceb3834268c267707e05b"} Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.025859 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70c1907b6d2a3aafd0d7da26c7d5996ed12dc59c229ceb3834268c267707e05b" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.025913 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-v8fhw" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.031808 4830 generic.go:334] "Generic (PLEG): container finished" podID="824ae862-2f34-4c76-b9b4-c275a058c466" containerID="9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98" exitCode=143 Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.031881 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"824ae862-2f34-4c76-b9b4-c275a058c466","Type":"ContainerDied","Data":"9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98"} Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.034188 4830 generic.go:334] "Generic (PLEG): container finished" podID="f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" containerID="4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3" exitCode=0 Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.034207 4830 generic.go:334] "Generic (PLEG): container finished" podID="f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" containerID="f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae" exitCode=143 Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.034410 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="865db8fd-ea83-4b7a-a24a-546284b3f835" containerName="nova-scheduler-scheduler" containerID="cri-o://b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda" gracePeriod=30 Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.034702 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.035241 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc","Type":"ContainerDied","Data":"4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3"} Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.035272 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc","Type":"ContainerDied","Data":"f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae"} Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.036317 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61d0377-2105-49a0-aeb0-5dfd2b07fdcc","Type":"ContainerDied","Data":"272b17ba4526241aca3f893710da67ef365311dc791f264eb6e3ebb8506914a1"} Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.036355 4830 scope.go:117] "RemoveContainer" containerID="4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.049664 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-config-data" (OuterVolumeSpecName: "config-data") pod "f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" (UID: "f61d0377-2105-49a0-aeb0-5dfd2b07fdcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.060682 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" (UID: "f61d0377-2105-49a0-aeb0-5dfd2b07fdcc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.073737 4830 scope.go:117] "RemoveContainer" containerID="f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.088370 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" (UID: "f61d0377-2105-49a0-aeb0-5dfd2b07fdcc"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.108815 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.108860 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.108873 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6552\" (UniqueName: \"kubernetes.io/projected/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-kube-api-access-s6552\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.108884 4830 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.132726 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 22:29:01 crc kubenswrapper[4830]: E1203 
22:29:01.133157 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" containerName="nova-metadata-metadata" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.133181 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" containerName="nova-metadata-metadata" Dec 03 22:29:01 crc kubenswrapper[4830]: E1203 22:29:01.133219 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a58c35e-a16d-4014-a958-67f2a5461287" containerName="nova-cell1-conductor-db-sync" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.133227 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a58c35e-a16d-4014-a958-67f2a5461287" containerName="nova-cell1-conductor-db-sync" Dec 03 22:29:01 crc kubenswrapper[4830]: E1203 22:29:01.133248 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ce7502-4906-4ec8-941a-04aa6486cf93" containerName="nova-manage" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.133256 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ce7502-4906-4ec8-941a-04aa6486cf93" containerName="nova-manage" Dec 03 22:29:01 crc kubenswrapper[4830]: E1203 22:29:01.133272 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" containerName="nova-metadata-log" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.133280 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" containerName="nova-metadata-log" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.133526 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ce7502-4906-4ec8-941a-04aa6486cf93" containerName="nova-manage" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.133552 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a58c35e-a16d-4014-a958-67f2a5461287" containerName="nova-cell1-conductor-db-sync" Dec 03 
22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.133565 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" containerName="nova-metadata-metadata" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.133579 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" containerName="nova-metadata-log" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.133723 4830 scope.go:117] "RemoveContainer" containerID="4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3" Dec 03 22:29:01 crc kubenswrapper[4830]: E1203 22:29:01.134356 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3\": container with ID starting with 4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3 not found: ID does not exist" containerID="4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.134401 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3"} err="failed to get container status \"4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3\": rpc error: code = NotFound desc = could not find container \"4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3\": container with ID starting with 4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3 not found: ID does not exist" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.134429 4830 scope.go:117] "RemoveContainer" containerID="f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.134484 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:01 crc kubenswrapper[4830]: E1203 22:29:01.134966 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae\": container with ID starting with f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae not found: ID does not exist" containerID="f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.134989 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae"} err="failed to get container status \"f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae\": rpc error: code = NotFound desc = could not find container \"f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae\": container with ID starting with f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae not found: ID does not exist" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.135006 4830 scope.go:117] "RemoveContainer" containerID="4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.138012 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3"} err="failed to get container status \"4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3\": rpc error: code = NotFound desc = could not find container \"4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3\": container with ID starting with 4ac6fe3f68fa7061e3461f6d11eef83064405a67d775fab8d986bb20d87044d3 not found: ID does not exist" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.138047 4830 scope.go:117] "RemoveContainer" 
containerID="f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.138408 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae"} err="failed to get container status \"f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae\": rpc error: code = NotFound desc = could not find container \"f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae\": container with ID starting with f38930465154503bef53cfa9b1086b7d3cdaea83cec44c5aac53dd6c7329e7ae not found: ID does not exist" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.139197 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.142934 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.211827 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eea67ad-cea1-4acf-a451-76a490e27693-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1eea67ad-cea1-4acf-a451-76a490e27693\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.212104 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j79md\" (UniqueName: \"kubernetes.io/projected/1eea67ad-cea1-4acf-a451-76a490e27693-kube-api-access-j79md\") pod \"nova-cell1-conductor-0\" (UID: \"1eea67ad-cea1-4acf-a451-76a490e27693\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.212314 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1eea67ad-cea1-4acf-a451-76a490e27693-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1eea67ad-cea1-4acf-a451-76a490e27693\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.314830 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eea67ad-cea1-4acf-a451-76a490e27693-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1eea67ad-cea1-4acf-a451-76a490e27693\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.315454 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eea67ad-cea1-4acf-a451-76a490e27693-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1eea67ad-cea1-4acf-a451-76a490e27693\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.315846 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j79md\" (UniqueName: \"kubernetes.io/projected/1eea67ad-cea1-4acf-a451-76a490e27693-kube-api-access-j79md\") pod \"nova-cell1-conductor-0\" (UID: \"1eea67ad-cea1-4acf-a451-76a490e27693\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.318963 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eea67ad-cea1-4acf-a451-76a490e27693-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1eea67ad-cea1-4acf-a451-76a490e27693\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.319127 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eea67ad-cea1-4acf-a451-76a490e27693-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"1eea67ad-cea1-4acf-a451-76a490e27693\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.335784 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j79md\" (UniqueName: \"kubernetes.io/projected/1eea67ad-cea1-4acf-a451-76a490e27693-kube-api-access-j79md\") pod \"nova-cell1-conductor-0\" (UID: \"1eea67ad-cea1-4acf-a451-76a490e27693\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.407761 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.424386 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.437121 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.439734 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.448747 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.449060 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.455162 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.482852 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.623118 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.623495 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e1d3561-51e0-4713-a94f-59f8dcce1e29-logs\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.623560 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mm4d\" (UniqueName: \"kubernetes.io/projected/5e1d3561-51e0-4713-a94f-59f8dcce1e29-kube-api-access-7mm4d\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.623712 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-config-data\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.623795 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.726846 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-config-data\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.726956 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.727044 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.727075 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e1d3561-51e0-4713-a94f-59f8dcce1e29-logs\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.727112 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mm4d\" (UniqueName: \"kubernetes.io/projected/5e1d3561-51e0-4713-a94f-59f8dcce1e29-kube-api-access-7mm4d\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " 
pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.728007 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e1d3561-51e0-4713-a94f-59f8dcce1e29-logs\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.732161 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-config-data\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.736076 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.748643 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mm4d\" (UniqueName: \"kubernetes.io/projected/5e1d3561-51e0-4713-a94f-59f8dcce1e29-kube-api-access-7mm4d\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.749649 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.901543 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:29:01 crc kubenswrapper[4830]: I1203 22:29:01.928222 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.124286 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1eea67ad-cea1-4acf-a451-76a490e27693","Type":"ContainerStarted","Data":"43bf0d0a9988a603d06fab75df93e3e59ea1be6efab07d4f8738259b424f4056"} Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.228975 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s8qgr"] Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.234425 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.251735 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8qgr"] Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.347822 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-catalog-content\") pod \"redhat-operators-s8qgr\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.348169 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb2d8\" (UniqueName: \"kubernetes.io/projected/bbe2500c-976b-4cd8-983a-e7b1767d03fa-kube-api-access-tb2d8\") pod \"redhat-operators-s8qgr\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.348261 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-utilities\") pod \"redhat-operators-s8qgr\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.449403 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-catalog-content\") pod \"redhat-operators-s8qgr\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.449511 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb2d8\" (UniqueName: \"kubernetes.io/projected/bbe2500c-976b-4cd8-983a-e7b1767d03fa-kube-api-access-tb2d8\") pod \"redhat-operators-s8qgr\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.449613 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-utilities\") pod \"redhat-operators-s8qgr\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.450084 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-utilities\") pod \"redhat-operators-s8qgr\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.450283 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-catalog-content\") pod \"redhat-operators-s8qgr\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.487968 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb2d8\" (UniqueName: \"kubernetes.io/projected/bbe2500c-976b-4cd8-983a-e7b1767d03fa-kube-api-access-tb2d8\") pod \"redhat-operators-s8qgr\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.495450 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.573497 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.767137 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.963586 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-combined-ca-bundle\") pod \"865db8fd-ea83-4b7a-a24a-546284b3f835\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.963917 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-config-data\") pod \"865db8fd-ea83-4b7a-a24a-546284b3f835\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.964035 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mftgd\" (UniqueName: \"kubernetes.io/projected/865db8fd-ea83-4b7a-a24a-546284b3f835-kube-api-access-mftgd\") pod \"865db8fd-ea83-4b7a-a24a-546284b3f835\" (UID: \"865db8fd-ea83-4b7a-a24a-546284b3f835\") " Dec 03 22:29:02 crc kubenswrapper[4830]: I1203 22:29:02.971254 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865db8fd-ea83-4b7a-a24a-546284b3f835-kube-api-access-mftgd" (OuterVolumeSpecName: "kube-api-access-mftgd") pod "865db8fd-ea83-4b7a-a24a-546284b3f835" (UID: "865db8fd-ea83-4b7a-a24a-546284b3f835"). InnerVolumeSpecName "kube-api-access-mftgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.004565 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-config-data" (OuterVolumeSpecName: "config-data") pod "865db8fd-ea83-4b7a-a24a-546284b3f835" (UID: "865db8fd-ea83-4b7a-a24a-546284b3f835"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.016357 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "865db8fd-ea83-4b7a-a24a-546284b3f835" (UID: "865db8fd-ea83-4b7a-a24a-546284b3f835"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.066526 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.066576 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/865db8fd-ea83-4b7a-a24a-546284b3f835-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.066594 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mftgd\" (UniqueName: \"kubernetes.io/projected/865db8fd-ea83-4b7a-a24a-546284b3f835-kube-api-access-mftgd\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:03 crc kubenswrapper[4830]: W1203 22:29:03.143290 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbe2500c_976b_4cd8_983a_e7b1767d03fa.slice/crio-6eb534ffef9f1e9c901ef0ef70d7fd2b195bf43561d01f6900bc1c5141588cb5 WatchSource:0}: Error finding container 6eb534ffef9f1e9c901ef0ef70d7fd2b195bf43561d01f6900bc1c5141588cb5: Status 404 returned error can't find the container with id 6eb534ffef9f1e9c901ef0ef70d7fd2b195bf43561d01f6900bc1c5141588cb5 Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.145582 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-s8qgr"] Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.163605 4830 generic.go:334] "Generic (PLEG): container finished" podID="865db8fd-ea83-4b7a-a24a-546284b3f835" containerID="b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda" exitCode=0 Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.163703 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"865db8fd-ea83-4b7a-a24a-546284b3f835","Type":"ContainerDied","Data":"b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda"} Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.163702 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.163734 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"865db8fd-ea83-4b7a-a24a-546284b3f835","Type":"ContainerDied","Data":"8de33e8876e793f4a83c3c619a7bc3f6e5a9f2bb0ae396da4cbb1f6ef0d59d03"} Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.163755 4830 scope.go:117] "RemoveContainer" containerID="b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.169223 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8qgr" event={"ID":"bbe2500c-976b-4cd8-983a-e7b1767d03fa","Type":"ContainerStarted","Data":"6eb534ffef9f1e9c901ef0ef70d7fd2b195bf43561d01f6900bc1c5141588cb5"} Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.171128 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e1d3561-51e0-4713-a94f-59f8dcce1e29","Type":"ContainerStarted","Data":"0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261"} Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.171151 4830 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-metadata-0" event={"ID":"5e1d3561-51e0-4713-a94f-59f8dcce1e29","Type":"ContainerStarted","Data":"8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb"} Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.171161 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e1d3561-51e0-4713-a94f-59f8dcce1e29","Type":"ContainerStarted","Data":"268aa1086902b3172bdf8902e8478ed49719d91cba650088c9a24b06d1ca97de"} Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.174001 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1eea67ad-cea1-4acf-a451-76a490e27693","Type":"ContainerStarted","Data":"f292fb3ac22303b6ab0573b82918904001d99ef708b62965daeff292c6f88c93"} Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.174815 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.197910 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.19789303 podStartE2EDuration="2.19789303s" podCreationTimestamp="2025-12-03 22:29:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:29:03.190963733 +0000 UTC m=+1432.187425102" watchObservedRunningTime="2025-12-03 22:29:03.19789303 +0000 UTC m=+1432.194354379" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.198428 4830 scope.go:117] "RemoveContainer" containerID="b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda" Dec 03 22:29:03 crc kubenswrapper[4830]: E1203 22:29:03.199760 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda\": container with ID starting with 
b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda not found: ID does not exist" containerID="b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.199881 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda"} err="failed to get container status \"b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda\": rpc error: code = NotFound desc = could not find container \"b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda\": container with ID starting with b214689810bbc67c5d21ac3a49e820a5f599342e011178ec0a86e0b15ec28eda not found: ID does not exist" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.232551 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.247955 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.259180 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:03 crc kubenswrapper[4830]: E1203 22:29:03.259896 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="865db8fd-ea83-4b7a-a24a-546284b3f835" containerName="nova-scheduler-scheduler" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.259971 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="865db8fd-ea83-4b7a-a24a-546284b3f835" containerName="nova-scheduler-scheduler" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.260257 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="865db8fd-ea83-4b7a-a24a-546284b3f835" containerName="nova-scheduler-scheduler" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.260530 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.260491063 podStartE2EDuration="2.260491063s" podCreationTimestamp="2025-12-03 22:29:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:29:03.229372302 +0000 UTC m=+1432.225833651" watchObservedRunningTime="2025-12-03 22:29:03.260491063 +0000 UTC m=+1432.256952412" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.261248 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.263082 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.270095 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.270145 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-config-data\") pod \"nova-scheduler-0\" (UID: \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.270234 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgwpr\" (UniqueName: \"kubernetes.io/projected/b264ecc9-1aeb-4e6e-a342-73b4675c492a-kube-api-access-sgwpr\") pod \"nova-scheduler-0\" (UID: \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.289156 
4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.360668 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865db8fd-ea83-4b7a-a24a-546284b3f835" path="/var/lib/kubelet/pods/865db8fd-ea83-4b7a-a24a-546284b3f835/volumes" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.361983 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f61d0377-2105-49a0-aeb0-5dfd2b07fdcc" path="/var/lib/kubelet/pods/f61d0377-2105-49a0-aeb0-5dfd2b07fdcc/volumes" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.372586 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgwpr\" (UniqueName: \"kubernetes.io/projected/b264ecc9-1aeb-4e6e-a342-73b4675c492a-kube-api-access-sgwpr\") pod \"nova-scheduler-0\" (UID: \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.372727 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.372753 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-config-data\") pod \"nova-scheduler-0\" (UID: \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.380978 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-config-data\") pod \"nova-scheduler-0\" (UID: 
\"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.381078 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.397978 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgwpr\" (UniqueName: \"kubernetes.io/projected/b264ecc9-1aeb-4e6e-a342-73b4675c492a-kube-api-access-sgwpr\") pod \"nova-scheduler-0\" (UID: \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.408741 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.475127 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-lrtz2"] Dec 03 22:29:03 crc kubenswrapper[4830]: I1203 22:29:03.598444 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:29:04 crc kubenswrapper[4830]: I1203 22:29:04.003172 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:04 crc kubenswrapper[4830]: I1203 22:29:04.185547 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b264ecc9-1aeb-4e6e-a342-73b4675c492a","Type":"ContainerStarted","Data":"2f2bf73bf72078c7dcf91e1d8af9a73fdde6792e6a59131bbe3b8a1f2e412099"} Dec 03 22:29:04 crc kubenswrapper[4830]: I1203 22:29:04.187141 4830 generic.go:334] "Generic (PLEG): container finished" podID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerID="068ed922d27242f5f545a8a0793330910fd6e25a070e6f7abdf1929b2e14d5de" exitCode=0 Dec 03 22:29:04 crc kubenswrapper[4830]: I1203 22:29:04.187192 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8qgr" event={"ID":"bbe2500c-976b-4cd8-983a-e7b1767d03fa","Type":"ContainerDied","Data":"068ed922d27242f5f545a8a0793330910fd6e25a070e6f7abdf1929b2e14d5de"} Dec 03 22:29:04 crc kubenswrapper[4830]: I1203 22:29:04.191546 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" podUID="33def065-6580-4b21-b0e1-ebdf0897c741" containerName="dnsmasq-dns" containerID="cri-o://a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9" gracePeriod=10 Dec 03 22:29:04 crc kubenswrapper[4830]: I1203 22:29:04.921787 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.031773 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-config\") pod \"33def065-6580-4b21-b0e1-ebdf0897c741\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.032185 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-swift-storage-0\") pod \"33def065-6580-4b21-b0e1-ebdf0897c741\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.032295 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-nb\") pod \"33def065-6580-4b21-b0e1-ebdf0897c741\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.032393 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6lrv\" (UniqueName: \"kubernetes.io/projected/33def065-6580-4b21-b0e1-ebdf0897c741-kube-api-access-d6lrv\") pod \"33def065-6580-4b21-b0e1-ebdf0897c741\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.032431 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-sb\") pod \"33def065-6580-4b21-b0e1-ebdf0897c741\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.032460 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-svc\") pod \"33def065-6580-4b21-b0e1-ebdf0897c741\" (UID: \"33def065-6580-4b21-b0e1-ebdf0897c741\") " Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.046760 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33def065-6580-4b21-b0e1-ebdf0897c741-kube-api-access-d6lrv" (OuterVolumeSpecName: "kube-api-access-d6lrv") pod "33def065-6580-4b21-b0e1-ebdf0897c741" (UID: "33def065-6580-4b21-b0e1-ebdf0897c741"). InnerVolumeSpecName "kube-api-access-d6lrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.105459 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33def065-6580-4b21-b0e1-ebdf0897c741" (UID: "33def065-6580-4b21-b0e1-ebdf0897c741"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.116940 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33def065-6580-4b21-b0e1-ebdf0897c741" (UID: "33def065-6580-4b21-b0e1-ebdf0897c741"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.126046 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-config" (OuterVolumeSpecName: "config") pod "33def065-6580-4b21-b0e1-ebdf0897c741" (UID: "33def065-6580-4b21-b0e1-ebdf0897c741"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.129339 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.134839 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.134865 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.134878 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6lrv\" (UniqueName: \"kubernetes.io/projected/33def065-6580-4b21-b0e1-ebdf0897c741-kube-api-access-d6lrv\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.134887 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.142147 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "33def065-6580-4b21-b0e1-ebdf0897c741" (UID: "33def065-6580-4b21-b0e1-ebdf0897c741"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.148138 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33def065-6580-4b21-b0e1-ebdf0897c741" (UID: "33def065-6580-4b21-b0e1-ebdf0897c741"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.207502 4830 generic.go:334] "Generic (PLEG): container finished" podID="824ae862-2f34-4c76-b9b4-c275a058c466" containerID="193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005" exitCode=0 Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.207768 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.208887 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"824ae862-2f34-4c76-b9b4-c275a058c466","Type":"ContainerDied","Data":"193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005"} Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.209192 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"824ae862-2f34-4c76-b9b4-c275a058c466","Type":"ContainerDied","Data":"353d87ff666f3a8f2416e3af3132b3af260b67c52bcfcc7ff523dfd598a1a035"} Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.209225 4830 scope.go:117] "RemoveContainer" containerID="193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.216088 4830 generic.go:334] "Generic (PLEG): container finished" podID="33def065-6580-4b21-b0e1-ebdf0897c741" containerID="a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9" exitCode=0 Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 
22:29:05.216175 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" event={"ID":"33def065-6580-4b21-b0e1-ebdf0897c741","Type":"ContainerDied","Data":"a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9"} Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.216202 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" event={"ID":"33def065-6580-4b21-b0e1-ebdf0897c741","Type":"ContainerDied","Data":"04d257015d872c4947c0b5ad9e1ed6747d35a993f0e6d2281c0ad2c884962c7c"} Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.216229 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-lrtz2" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.221727 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b264ecc9-1aeb-4e6e-a342-73b4675c492a","Type":"ContainerStarted","Data":"8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450"} Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.229381 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8qgr" event={"ID":"bbe2500c-976b-4cd8-983a-e7b1767d03fa","Type":"ContainerStarted","Data":"f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df"} Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.238176 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngjzp\" (UniqueName: \"kubernetes.io/projected/824ae862-2f34-4c76-b9b4-c275a058c466-kube-api-access-ngjzp\") pod \"824ae862-2f34-4c76-b9b4-c275a058c466\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.238342 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-combined-ca-bundle\") pod \"824ae862-2f34-4c76-b9b4-c275a058c466\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.238404 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-config-data\") pod \"824ae862-2f34-4c76-b9b4-c275a058c466\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.238465 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824ae862-2f34-4c76-b9b4-c275a058c466-logs\") pod \"824ae862-2f34-4c76-b9b4-c275a058c466\" (UID: \"824ae862-2f34-4c76-b9b4-c275a058c466\") " Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.245607 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/824ae862-2f34-4c76-b9b4-c275a058c466-logs" (OuterVolumeSpecName: "logs") pod "824ae862-2f34-4c76-b9b4-c275a058c466" (UID: "824ae862-2f34-4c76-b9b4-c275a058c466"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.248643 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824ae862-2f34-4c76-b9b4-c275a058c466-kube-api-access-ngjzp" (OuterVolumeSpecName: "kube-api-access-ngjzp") pod "824ae862-2f34-4c76-b9b4-c275a058c466" (UID: "824ae862-2f34-4c76-b9b4-c275a058c466"). InnerVolumeSpecName "kube-api-access-ngjzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.248898 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.248932 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824ae862-2f34-4c76-b9b4-c275a058c466-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.248944 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngjzp\" (UniqueName: \"kubernetes.io/projected/824ae862-2f34-4c76-b9b4-c275a058c466-kube-api-access-ngjzp\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.248955 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33def065-6580-4b21-b0e1-ebdf0897c741-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.261896 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.261877211 podStartE2EDuration="2.261877211s" podCreationTimestamp="2025-12-03 22:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:29:05.243996508 +0000 UTC m=+1434.240457857" watchObservedRunningTime="2025-12-03 22:29:05.261877211 +0000 UTC m=+1434.258338560" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.264729 4830 scope.go:117] "RemoveContainer" containerID="9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.278261 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-config-data" (OuterVolumeSpecName: "config-data") pod "824ae862-2f34-4c76-b9b4-c275a058c466" (UID: "824ae862-2f34-4c76-b9b4-c275a058c466"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.290694 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "824ae862-2f34-4c76-b9b4-c275a058c466" (UID: "824ae862-2f34-4c76-b9b4-c275a058c466"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.291403 4830 scope.go:117] "RemoveContainer" containerID="193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005" Dec 03 22:29:05 crc kubenswrapper[4830]: E1203 22:29:05.291962 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005\": container with ID starting with 193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005 not found: ID does not exist" containerID="193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.291997 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005"} err="failed to get container status \"193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005\": rpc error: code = NotFound desc = could not find container \"193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005\": container with ID starting with 193de327bd6141e0ee785703a23748930bfb55450c2c8ac79e1186bd21f8e005 not found: ID does not exist" Dec 03 22:29:05 
crc kubenswrapper[4830]: I1203 22:29:05.292027 4830 scope.go:117] "RemoveContainer" containerID="9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98" Dec 03 22:29:05 crc kubenswrapper[4830]: E1203 22:29:05.292406 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98\": container with ID starting with 9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98 not found: ID does not exist" containerID="9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.292934 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98"} err="failed to get container status \"9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98\": rpc error: code = NotFound desc = could not find container \"9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98\": container with ID starting with 9b6fbb65317f5a1b2b1e5157377ec0ab881dd331cabbfb264947626d7d353f98 not found: ID does not exist" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.292973 4830 scope.go:117] "RemoveContainer" containerID="a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.313091 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-lrtz2"] Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.319580 4830 scope.go:117] "RemoveContainer" containerID="bf57fefff9e375e2f71a79e460f8a5512129d908ae453a7d5279f22cd5a4feb1" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.328398 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-lrtz2"] Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.350081 4830 scope.go:117] 
"RemoveContainer" containerID="a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.350707 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33def065-6580-4b21-b0e1-ebdf0897c741" path="/var/lib/kubelet/pods/33def065-6580-4b21-b0e1-ebdf0897c741/volumes" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.350885 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.350913 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824ae862-2f34-4c76-b9b4-c275a058c466-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:05 crc kubenswrapper[4830]: E1203 22:29:05.354084 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9\": container with ID starting with a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9 not found: ID does not exist" containerID="a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.354141 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9"} err="failed to get container status \"a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9\": rpc error: code = NotFound desc = could not find container \"a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9\": container with ID starting with a2990916fa83fd25c1f05f30550b783b3ccb5d0f873322891b390b2f26e315e9 not found: ID does not exist" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.354171 
4830 scope.go:117] "RemoveContainer" containerID="bf57fefff9e375e2f71a79e460f8a5512129d908ae453a7d5279f22cd5a4feb1" Dec 03 22:29:05 crc kubenswrapper[4830]: E1203 22:29:05.354671 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf57fefff9e375e2f71a79e460f8a5512129d908ae453a7d5279f22cd5a4feb1\": container with ID starting with bf57fefff9e375e2f71a79e460f8a5512129d908ae453a7d5279f22cd5a4feb1 not found: ID does not exist" containerID="bf57fefff9e375e2f71a79e460f8a5512129d908ae453a7d5279f22cd5a4feb1" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.354706 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf57fefff9e375e2f71a79e460f8a5512129d908ae453a7d5279f22cd5a4feb1"} err="failed to get container status \"bf57fefff9e375e2f71a79e460f8a5512129d908ae453a7d5279f22cd5a4feb1\": rpc error: code = NotFound desc = could not find container \"bf57fefff9e375e2f71a79e460f8a5512129d908ae453a7d5279f22cd5a4feb1\": container with ID starting with bf57fefff9e375e2f71a79e460f8a5512129d908ae453a7d5279f22cd5a4feb1 not found: ID does not exist" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.532796 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.543519 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.559105 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:05 crc kubenswrapper[4830]: E1203 22:29:05.563636 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824ae862-2f34-4c76-b9b4-c275a058c466" containerName="nova-api-api" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.563665 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="824ae862-2f34-4c76-b9b4-c275a058c466" containerName="nova-api-api" 
Dec 03 22:29:05 crc kubenswrapper[4830]: E1203 22:29:05.563719 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33def065-6580-4b21-b0e1-ebdf0897c741" containerName="init" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.563727 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="33def065-6580-4b21-b0e1-ebdf0897c741" containerName="init" Dec 03 22:29:05 crc kubenswrapper[4830]: E1203 22:29:05.563780 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824ae862-2f34-4c76-b9b4-c275a058c466" containerName="nova-api-log" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.563798 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="824ae862-2f34-4c76-b9b4-c275a058c466" containerName="nova-api-log" Dec 03 22:29:05 crc kubenswrapper[4830]: E1203 22:29:05.563818 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33def065-6580-4b21-b0e1-ebdf0897c741" containerName="dnsmasq-dns" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.563824 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="33def065-6580-4b21-b0e1-ebdf0897c741" containerName="dnsmasq-dns" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.564708 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="33def065-6580-4b21-b0e1-ebdf0897c741" containerName="dnsmasq-dns" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.564734 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="824ae862-2f34-4c76-b9b4-c275a058c466" containerName="nova-api-api" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.564757 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="824ae862-2f34-4c76-b9b4-c275a058c466" containerName="nova-api-log" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.568185 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.573246 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.597935 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.656010 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xg6g\" (UniqueName: \"kubernetes.io/projected/1567b5ee-3b86-4957-8cfa-536ac66a34b5-kube-api-access-5xg6g\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.656098 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1567b5ee-3b86-4957-8cfa-536ac66a34b5-logs\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.656133 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-config-data\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.656203 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.757911 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5xg6g\" (UniqueName: \"kubernetes.io/projected/1567b5ee-3b86-4957-8cfa-536ac66a34b5-kube-api-access-5xg6g\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.758300 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1567b5ee-3b86-4957-8cfa-536ac66a34b5-logs\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.758331 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-config-data\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.758398 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.759600 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1567b5ee-3b86-4957-8cfa-536ac66a34b5-logs\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.765021 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.772483 
4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-config-data\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.778946 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xg6g\" (UniqueName: \"kubernetes.io/projected/1567b5ee-3b86-4957-8cfa-536ac66a34b5-kube-api-access-5xg6g\") pod \"nova-api-0\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " pod="openstack/nova-api-0" Dec 03 22:29:05 crc kubenswrapper[4830]: I1203 22:29:05.891447 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:06 crc kubenswrapper[4830]: I1203 22:29:06.240229 4830 generic.go:334] "Generic (PLEG): container finished" podID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerID="f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df" exitCode=0 Dec 03 22:29:06 crc kubenswrapper[4830]: I1203 22:29:06.240371 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8qgr" event={"ID":"bbe2500c-976b-4cd8-983a-e7b1767d03fa","Type":"ContainerDied","Data":"f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df"} Dec 03 22:29:06 crc kubenswrapper[4830]: I1203 22:29:06.356623 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:06 crc kubenswrapper[4830]: W1203 22:29:06.360417 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1567b5ee_3b86_4957_8cfa_536ac66a34b5.slice/crio-3c1955fea67fa06618c032eb8a2990e71632775a7bfeba7e2c1a189a3941b0dc WatchSource:0}: Error finding container 3c1955fea67fa06618c032eb8a2990e71632775a7bfeba7e2c1a189a3941b0dc: Status 404 returned error can't find the container with id 
3c1955fea67fa06618c032eb8a2990e71632775a7bfeba7e2c1a189a3941b0dc Dec 03 22:29:06 crc kubenswrapper[4830]: I1203 22:29:06.901708 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 22:29:06 crc kubenswrapper[4830]: I1203 22:29:06.902710 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 22:29:07 crc kubenswrapper[4830]: I1203 22:29:07.263103 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1567b5ee-3b86-4957-8cfa-536ac66a34b5","Type":"ContainerStarted","Data":"5d0bcdaafd5c742338eb38d225f2e28154a46f570f5199770ee9b45b3d954123"} Dec 03 22:29:07 crc kubenswrapper[4830]: I1203 22:29:07.263379 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1567b5ee-3b86-4957-8cfa-536ac66a34b5","Type":"ContainerStarted","Data":"8002261c5772708d9415dca3b89607ab0876a985914166943e3869dc4256871a"} Dec 03 22:29:07 crc kubenswrapper[4830]: I1203 22:29:07.263392 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1567b5ee-3b86-4957-8cfa-536ac66a34b5","Type":"ContainerStarted","Data":"3c1955fea67fa06618c032eb8a2990e71632775a7bfeba7e2c1a189a3941b0dc"} Dec 03 22:29:07 crc kubenswrapper[4830]: I1203 22:29:07.280458 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.280441075 podStartE2EDuration="2.280441075s" podCreationTimestamp="2025-12-03 22:29:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:29:07.276183789 +0000 UTC m=+1436.272645158" watchObservedRunningTime="2025-12-03 22:29:07.280441075 +0000 UTC m=+1436.276902424" Dec 03 22:29:07 crc kubenswrapper[4830]: I1203 22:29:07.347867 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="824ae862-2f34-4c76-b9b4-c275a058c466" path="/var/lib/kubelet/pods/824ae862-2f34-4c76-b9b4-c275a058c466/volumes" Dec 03 22:29:08 crc kubenswrapper[4830]: I1203 22:29:08.274942 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8qgr" event={"ID":"bbe2500c-976b-4cd8-983a-e7b1767d03fa","Type":"ContainerStarted","Data":"d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537"} Dec 03 22:29:08 crc kubenswrapper[4830]: I1203 22:29:08.298637 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s8qgr" podStartSLOduration=3.440175624 podStartE2EDuration="6.298617801s" podCreationTimestamp="2025-12-03 22:29:02 +0000 UTC" firstStartedPulling="2025-12-03 22:29:04.19066079 +0000 UTC m=+1433.187122139" lastFinishedPulling="2025-12-03 22:29:07.049102957 +0000 UTC m=+1436.045564316" observedRunningTime="2025-12-03 22:29:08.296949067 +0000 UTC m=+1437.293410436" watchObservedRunningTime="2025-12-03 22:29:08.298617801 +0000 UTC m=+1437.295079150" Dec 03 22:29:08 crc kubenswrapper[4830]: I1203 22:29:08.598625 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 22:29:11 crc kubenswrapper[4830]: I1203 22:29:11.492097 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 22:29:11 crc kubenswrapper[4830]: I1203 22:29:11.901825 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 22:29:11 crc kubenswrapper[4830]: I1203 22:29:11.903394 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 22:29:12 crc kubenswrapper[4830]: I1203 22:29:12.573635 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:12 crc kubenswrapper[4830]: I1203 
22:29:12.574254 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:12 crc kubenswrapper[4830]: I1203 22:29:12.918713 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.219:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:29:12 crc kubenswrapper[4830]: I1203 22:29:12.918732 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.219:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:29:13 crc kubenswrapper[4830]: I1203 22:29:13.599531 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 22:29:13 crc kubenswrapper[4830]: I1203 22:29:13.628627 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s8qgr" podUID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerName="registry-server" probeResult="failure" output=< Dec 03 22:29:13 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Dec 03 22:29:13 crc kubenswrapper[4830]: > Dec 03 22:29:13 crc kubenswrapper[4830]: I1203 22:29:13.651750 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 22:29:14 crc kubenswrapper[4830]: I1203 22:29:14.372183 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 22:29:15 crc kubenswrapper[4830]: I1203 22:29:15.892397 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 
22:29:15 crc kubenswrapper[4830]: I1203 22:29:15.892786 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:29:16 crc kubenswrapper[4830]: I1203 22:29:16.974762 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:29:16 crc kubenswrapper[4830]: I1203 22:29:16.975147 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:29:17 crc kubenswrapper[4830]: I1203 22:29:17.996106 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 22:29:21 crc kubenswrapper[4830]: I1203 22:29:21.759604 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:21 crc kubenswrapper[4830]: I1203 22:29:21.760291 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="443cf1a9-f7ab-413e-bddf-08978b24fc87" containerName="kube-state-metrics" containerID="cri-o://e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0" gracePeriod=30 Dec 03 22:29:21 crc kubenswrapper[4830]: I1203 22:29:21.936409 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 22:29:21 crc kubenswrapper[4830]: I1203 22:29:21.937421 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 22:29:21 crc kubenswrapper[4830]: I1203 22:29:21.942207 4830 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.346669 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.426114 4830 generic.go:334] "Generic (PLEG): container finished" podID="443cf1a9-f7ab-413e-bddf-08978b24fc87" containerID="e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0" exitCode=2 Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.427368 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.427594 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"443cf1a9-f7ab-413e-bddf-08978b24fc87","Type":"ContainerDied","Data":"e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0"} Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.427738 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"443cf1a9-f7ab-413e-bddf-08978b24fc87","Type":"ContainerDied","Data":"343938ffaf89146bfeb86e45c007ed5938461b7f03a682f76836bd3992e8f657"} Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.427783 4830 scope.go:117] "RemoveContainer" containerID="e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.438683 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.485686 4830 scope.go:117] "RemoveContainer" containerID="e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0" Dec 03 22:29:22 crc kubenswrapper[4830]: E1203 22:29:22.494538 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0\": container with ID starting with e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0 not found: ID does not exist" containerID="e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.494578 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0"} err="failed to get container status \"e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0\": rpc error: code = NotFound desc = could not find container \"e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0\": container with ID starting with e793284f4a70f89dcb73b815056723ea18d025d1b360701d69537baf5d073ba0 not found: ID does not exist" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.521384 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td5f4\" (UniqueName: \"kubernetes.io/projected/443cf1a9-f7ab-413e-bddf-08978b24fc87-kube-api-access-td5f4\") pod \"443cf1a9-f7ab-413e-bddf-08978b24fc87\" (UID: \"443cf1a9-f7ab-413e-bddf-08978b24fc87\") " Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.530464 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443cf1a9-f7ab-413e-bddf-08978b24fc87-kube-api-access-td5f4" (OuterVolumeSpecName: "kube-api-access-td5f4") pod "443cf1a9-f7ab-413e-bddf-08978b24fc87" (UID: "443cf1a9-f7ab-413e-bddf-08978b24fc87"). InnerVolumeSpecName "kube-api-access-td5f4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.623326 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td5f4\" (UniqueName: \"kubernetes.io/projected/443cf1a9-f7ab-413e-bddf-08978b24fc87-kube-api-access-td5f4\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.643780 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.700219 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.764670 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.780486 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.790784 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:22 crc kubenswrapper[4830]: E1203 22:29:22.791239 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="443cf1a9-f7ab-413e-bddf-08978b24fc87" containerName="kube-state-metrics" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.791257 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="443cf1a9-f7ab-413e-bddf-08978b24fc87" containerName="kube-state-metrics" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.791474 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="443cf1a9-f7ab-413e-bddf-08978b24fc87" containerName="kube-state-metrics" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.792241 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.796116 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.796157 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.806908 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.890054 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8qgr"] Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.930992 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4e04514b-2828-4cd0-9fa7-7d5a970957a0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.931061 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e04514b-2828-4cd0-9fa7-7d5a970957a0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:22 crc kubenswrapper[4830]: I1203 22:29:22.931907 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns9n9\" (UniqueName: \"kubernetes.io/projected/4e04514b-2828-4cd0-9fa7-7d5a970957a0-kube-api-access-ns9n9\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:22 crc kubenswrapper[4830]: 
I1203 22:29:22.931995 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e04514b-2828-4cd0-9fa7-7d5a970957a0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:23 crc kubenswrapper[4830]: I1203 22:29:23.033485 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns9n9\" (UniqueName: \"kubernetes.io/projected/4e04514b-2828-4cd0-9fa7-7d5a970957a0-kube-api-access-ns9n9\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:23 crc kubenswrapper[4830]: I1203 22:29:23.033603 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e04514b-2828-4cd0-9fa7-7d5a970957a0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:23 crc kubenswrapper[4830]: I1203 22:29:23.033654 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4e04514b-2828-4cd0-9fa7-7d5a970957a0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:23 crc kubenswrapper[4830]: I1203 22:29:23.033708 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e04514b-2828-4cd0-9fa7-7d5a970957a0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:23 crc kubenswrapper[4830]: I1203 
22:29:23.038771 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4e04514b-2828-4cd0-9fa7-7d5a970957a0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:23 crc kubenswrapper[4830]: I1203 22:29:23.039196 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e04514b-2828-4cd0-9fa7-7d5a970957a0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:23 crc kubenswrapper[4830]: I1203 22:29:23.046726 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e04514b-2828-4cd0-9fa7-7d5a970957a0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:23 crc kubenswrapper[4830]: I1203 22:29:23.057792 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns9n9\" (UniqueName: \"kubernetes.io/projected/4e04514b-2828-4cd0-9fa7-7d5a970957a0-kube-api-access-ns9n9\") pod \"kube-state-metrics-0\" (UID: \"4e04514b-2828-4cd0-9fa7-7d5a970957a0\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:23 crc kubenswrapper[4830]: I1203 22:29:23.113097 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:29:23 crc kubenswrapper[4830]: I1203 22:29:23.355749 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="443cf1a9-f7ab-413e-bddf-08978b24fc87" path="/var/lib/kubelet/pods/443cf1a9-f7ab-413e-bddf-08978b24fc87/volumes" Dec 03 22:29:23 crc kubenswrapper[4830]: W1203 22:29:23.702035 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e04514b_2828_4cd0_9fa7_7d5a970957a0.slice/crio-ac47caf8f08c9f74600425ce224cc5ae84ce0ea7b3d5e44413405577da2b86d0 WatchSource:0}: Error finding container ac47caf8f08c9f74600425ce224cc5ae84ce0ea7b3d5e44413405577da2b86d0: Status 404 returned error can't find the container with id ac47caf8f08c9f74600425ce224cc5ae84ce0ea7b3d5e44413405577da2b86d0 Dec 03 22:29:23 crc kubenswrapper[4830]: I1203 22:29:23.704118 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.337566 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.338271 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="proxy-httpd" containerID="cri-o://fa2a3c97db2b7dd6dc82b68e486ae304d82402699b1d50f726e2937da06eb202" gracePeriod=30 Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.338268 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="ceilometer-central-agent" containerID="cri-o://9cefab27e4cf697e9f9cf881316dbe2a23f4c6786296debd89dda073575986c8" gracePeriod=30 Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.338333 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="sg-core" containerID="cri-o://5012856d8b265195bd2e96366a5da4fb89584064fe6e0130f7b7184ec84411ca" gracePeriod=30 Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.338395 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="ceilometer-notification-agent" containerID="cri-o://00f5771a3a9e81adda147d72dcc505c7c574eac214f135250c7782d1270d0996" gracePeriod=30 Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.463366 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4e04514b-2828-4cd0-9fa7-7d5a970957a0","Type":"ContainerStarted","Data":"59fdb633b55e85b1c1e3618cbcb6d60c606afdb3992ca2d4f88972d1e3ae846e"} Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.463877 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.463939 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4e04514b-2828-4cd0-9fa7-7d5a970957a0","Type":"ContainerStarted","Data":"ac47caf8f08c9f74600425ce224cc5ae84ce0ea7b3d5e44413405577da2b86d0"} Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.467387 4830 generic.go:334] "Generic (PLEG): container finished" podID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerID="fa2a3c97db2b7dd6dc82b68e486ae304d82402699b1d50f726e2937da06eb202" exitCode=0 Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.467417 4830 generic.go:334] "Generic (PLEG): container finished" podID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerID="5012856d8b265195bd2e96366a5da4fb89584064fe6e0130f7b7184ec84411ca" exitCode=2 Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.467462 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"c69e374c-c6f7-4d9e-a302-45844b3ca243","Type":"ContainerDied","Data":"fa2a3c97db2b7dd6dc82b68e486ae304d82402699b1d50f726e2937da06eb202"} Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.467538 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69e374c-c6f7-4d9e-a302-45844b3ca243","Type":"ContainerDied","Data":"5012856d8b265195bd2e96366a5da4fb89584064fe6e0130f7b7184ec84411ca"} Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.468191 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s8qgr" podUID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerName="registry-server" containerID="cri-o://d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537" gracePeriod=2 Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.487053 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.131068447 podStartE2EDuration="2.487034804s" podCreationTimestamp="2025-12-03 22:29:22 +0000 UTC" firstStartedPulling="2025-12-03 22:29:23.704218112 +0000 UTC m=+1452.700679471" lastFinishedPulling="2025-12-03 22:29:24.060184479 +0000 UTC m=+1453.056645828" observedRunningTime="2025-12-03 22:29:24.486779227 +0000 UTC m=+1453.483240576" watchObservedRunningTime="2025-12-03 22:29:24.487034804 +0000 UTC m=+1453.483496153" Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.933231 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.980975 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-utilities\") pod \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.981012 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb2d8\" (UniqueName: \"kubernetes.io/projected/bbe2500c-976b-4cd8-983a-e7b1767d03fa-kube-api-access-tb2d8\") pod \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.981048 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-catalog-content\") pod \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\" (UID: \"bbe2500c-976b-4cd8-983a-e7b1767d03fa\") " Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.981945 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-utilities" (OuterVolumeSpecName: "utilities") pod "bbe2500c-976b-4cd8-983a-e7b1767d03fa" (UID: "bbe2500c-976b-4cd8-983a-e7b1767d03fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:29:24 crc kubenswrapper[4830]: I1203 22:29:24.986246 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe2500c-976b-4cd8-983a-e7b1767d03fa-kube-api-access-tb2d8" (OuterVolumeSpecName: "kube-api-access-tb2d8") pod "bbe2500c-976b-4cd8-983a-e7b1767d03fa" (UID: "bbe2500c-976b-4cd8-983a-e7b1767d03fa"). InnerVolumeSpecName "kube-api-access-tb2d8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.084532 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.084594 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb2d8\" (UniqueName: \"kubernetes.io/projected/bbe2500c-976b-4cd8-983a-e7b1767d03fa-kube-api-access-tb2d8\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.086892 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbe2500c-976b-4cd8-983a-e7b1767d03fa" (UID: "bbe2500c-976b-4cd8-983a-e7b1767d03fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.186005 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe2500c-976b-4cd8-983a-e7b1767d03fa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.480061 4830 generic.go:334] "Generic (PLEG): container finished" podID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerID="9cefab27e4cf697e9f9cf881316dbe2a23f4c6786296debd89dda073575986c8" exitCode=0 Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.480143 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69e374c-c6f7-4d9e-a302-45844b3ca243","Type":"ContainerDied","Data":"9cefab27e4cf697e9f9cf881316dbe2a23f4c6786296debd89dda073575986c8"} Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.483528 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerID="d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537" exitCode=0 Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.483567 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8qgr" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.483625 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8qgr" event={"ID":"bbe2500c-976b-4cd8-983a-e7b1767d03fa","Type":"ContainerDied","Data":"d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537"} Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.483663 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8qgr" event={"ID":"bbe2500c-976b-4cd8-983a-e7b1767d03fa","Type":"ContainerDied","Data":"6eb534ffef9f1e9c901ef0ef70d7fd2b195bf43561d01f6900bc1c5141588cb5"} Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.483686 4830 scope.go:117] "RemoveContainer" containerID="d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.488009 4830 generic.go:334] "Generic (PLEG): container finished" podID="203323f1-89ed-49c0-b7c0-31273a77c22c" containerID="acede5cebe2ec492a7f5581f438c944410198aec1103c227865f8442b48df2eb" exitCode=137 Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.488826 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"203323f1-89ed-49c0-b7c0-31273a77c22c","Type":"ContainerDied","Data":"acede5cebe2ec492a7f5581f438c944410198aec1103c227865f8442b48df2eb"} Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.488865 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"203323f1-89ed-49c0-b7c0-31273a77c22c","Type":"ContainerDied","Data":"469972217b30fa43c2e4ce71d9558d959ab0023212990c1d848224b6fc364648"} Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.490422 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="469972217b30fa43c2e4ce71d9558d959ab0023212990c1d848224b6fc364648" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.508818 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.522234 4830 scope.go:117] "RemoveContainer" containerID="f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.543649 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8qgr"] Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.553577 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s8qgr"] Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.572049 4830 scope.go:117] "RemoveContainer" containerID="068ed922d27242f5f545a8a0793330910fd6e25a070e6f7abdf1929b2e14d5de" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.597356 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-config-data\") pod \"203323f1-89ed-49c0-b7c0-31273a77c22c\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.597533 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krtnf\" (UniqueName: \"kubernetes.io/projected/203323f1-89ed-49c0-b7c0-31273a77c22c-kube-api-access-krtnf\") pod \"203323f1-89ed-49c0-b7c0-31273a77c22c\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " Dec 03 22:29:25 crc 
kubenswrapper[4830]: I1203 22:29:25.597810 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-combined-ca-bundle\") pod \"203323f1-89ed-49c0-b7c0-31273a77c22c\" (UID: \"203323f1-89ed-49c0-b7c0-31273a77c22c\") " Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.607777 4830 scope.go:117] "RemoveContainer" containerID="d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.607848 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203323f1-89ed-49c0-b7c0-31273a77c22c-kube-api-access-krtnf" (OuterVolumeSpecName: "kube-api-access-krtnf") pod "203323f1-89ed-49c0-b7c0-31273a77c22c" (UID: "203323f1-89ed-49c0-b7c0-31273a77c22c"). InnerVolumeSpecName "kube-api-access-krtnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:25 crc kubenswrapper[4830]: E1203 22:29:25.608588 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537\": container with ID starting with d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537 not found: ID does not exist" containerID="d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.608636 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537"} err="failed to get container status \"d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537\": rpc error: code = NotFound desc = could not find container \"d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537\": container with ID starting with 
d4e4c98928ffd782293de585841af601f0963462ea2979597944b2240b840537 not found: ID does not exist" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.608669 4830 scope.go:117] "RemoveContainer" containerID="f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df" Dec 03 22:29:25 crc kubenswrapper[4830]: E1203 22:29:25.609122 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df\": container with ID starting with f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df not found: ID does not exist" containerID="f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.609153 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df"} err="failed to get container status \"f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df\": rpc error: code = NotFound desc = could not find container \"f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df\": container with ID starting with f10a2ea3e8ea61e10a70a8e7173516e281de0b3e30a98b6c5d6c2fcc18db11df not found: ID does not exist" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.609171 4830 scope.go:117] "RemoveContainer" containerID="068ed922d27242f5f545a8a0793330910fd6e25a070e6f7abdf1929b2e14d5de" Dec 03 22:29:25 crc kubenswrapper[4830]: E1203 22:29:25.609630 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"068ed922d27242f5f545a8a0793330910fd6e25a070e6f7abdf1929b2e14d5de\": container with ID starting with 068ed922d27242f5f545a8a0793330910fd6e25a070e6f7abdf1929b2e14d5de not found: ID does not exist" containerID="068ed922d27242f5f545a8a0793330910fd6e25a070e6f7abdf1929b2e14d5de" Dec 03 22:29:25 crc 
kubenswrapper[4830]: I1203 22:29:25.609687 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"068ed922d27242f5f545a8a0793330910fd6e25a070e6f7abdf1929b2e14d5de"} err="failed to get container status \"068ed922d27242f5f545a8a0793330910fd6e25a070e6f7abdf1929b2e14d5de\": rpc error: code = NotFound desc = could not find container \"068ed922d27242f5f545a8a0793330910fd6e25a070e6f7abdf1929b2e14d5de\": container with ID starting with 068ed922d27242f5f545a8a0793330910fd6e25a070e6f7abdf1929b2e14d5de not found: ID does not exist" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.629104 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-config-data" (OuterVolumeSpecName: "config-data") pod "203323f1-89ed-49c0-b7c0-31273a77c22c" (UID: "203323f1-89ed-49c0-b7c0-31273a77c22c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.631013 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "203323f1-89ed-49c0-b7c0-31273a77c22c" (UID: "203323f1-89ed-49c0-b7c0-31273a77c22c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.700884 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krtnf\" (UniqueName: \"kubernetes.io/projected/203323f1-89ed-49c0-b7c0-31273a77c22c-kube-api-access-krtnf\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.700917 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.700926 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/203323f1-89ed-49c0-b7c0-31273a77c22c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.895545 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.897821 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.897977 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 22:29:25 crc kubenswrapper[4830]: I1203 22:29:25.906162 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.498557 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.499891 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.503438 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.559091 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.569054 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.581153 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:29:26 crc kubenswrapper[4830]: E1203 22:29:26.581698 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerName="registry-server" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.581722 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerName="registry-server" Dec 03 22:29:26 crc kubenswrapper[4830]: E1203 22:29:26.581757 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerName="extract-content" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.581765 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerName="extract-content" Dec 03 22:29:26 crc kubenswrapper[4830]: E1203 22:29:26.581778 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203323f1-89ed-49c0-b7c0-31273a77c22c" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.581785 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="203323f1-89ed-49c0-b7c0-31273a77c22c" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 22:29:26 crc kubenswrapper[4830]: E1203 22:29:26.581807 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerName="extract-utilities" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.581814 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerName="extract-utilities" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.596636 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="203323f1-89ed-49c0-b7c0-31273a77c22c" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.596724 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" containerName="registry-server" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.597843 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.597980 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.606296 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.606449 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.606330 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.624496 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.624613 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.624636 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.624747 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.624959 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8j4h\" (UniqueName: \"kubernetes.io/projected/33e1d4ce-81f4-4a02-8ac5-384686943b19-kube-api-access-l8j4h\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.681354 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.681399 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.681436 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.682200 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d26c80f781595ab631a18d1aaee4115dc9be59de9280be665b2937d58ec45743"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.682251 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://d26c80f781595ab631a18d1aaee4115dc9be59de9280be665b2937d58ec45743" gracePeriod=600 Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.727450 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.727943 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.728164 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.728354 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8j4h\" (UniqueName: \"kubernetes.io/projected/33e1d4ce-81f4-4a02-8ac5-384686943b19-kube-api-access-l8j4h\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc 
kubenswrapper[4830]: I1203 22:29:26.728730 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.730556 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-j9jwq"] Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.733263 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.733480 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.734162 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.734164 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.744042 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-j9jwq"] Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.756241 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e1d4ce-81f4-4a02-8ac5-384686943b19-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.761411 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8j4h\" (UniqueName: \"kubernetes.io/projected/33e1d4ce-81f4-4a02-8ac5-384686943b19-kube-api-access-l8j4h\") pod \"nova-cell1-novncproxy-0\" (UID: \"33e1d4ce-81f4-4a02-8ac5-384686943b19\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.924887 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.940999 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.941053 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-config\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.941088 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.941145 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.941242 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm28x\" (UniqueName: \"kubernetes.io/projected/720ca8ef-d526-423c-a334-4e6a771a8e6e-kube-api-access-nm28x\") pod 
\"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:26 crc kubenswrapper[4830]: I1203 22:29:26.941287 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.046479 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.046555 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.046631 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm28x\" (UniqueName: \"kubernetes.io/projected/720ca8ef-d526-423c-a334-4e6a771a8e6e-kube-api-access-nm28x\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.046666 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.046736 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.046762 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-config\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.047754 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.047832 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.049178 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " 
pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.050631 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.052803 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-config\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.070857 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm28x\" (UniqueName: \"kubernetes.io/projected/720ca8ef-d526-423c-a334-4e6a771a8e6e-kube-api-access-nm28x\") pod \"dnsmasq-dns-5fd9b586ff-j9jwq\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.194832 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.357958 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203323f1-89ed-49c0-b7c0-31273a77c22c" path="/var/lib/kubelet/pods/203323f1-89ed-49c0-b7c0-31273a77c22c/volumes" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.358783 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbe2500c-976b-4cd8-983a-e7b1767d03fa" path="/var/lib/kubelet/pods/bbe2500c-976b-4cd8-983a-e7b1767d03fa/volumes" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.427612 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.525118 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33e1d4ce-81f4-4a02-8ac5-384686943b19","Type":"ContainerStarted","Data":"d9fa78444ac9815f1eb7b30fb07776c7e0e1071480bd799327719a249a32c9c9"} Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.555229 4830 generic.go:334] "Generic (PLEG): container finished" podID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerID="00f5771a3a9e81adda147d72dcc505c7c574eac214f135250c7782d1270d0996" exitCode=0 Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.555287 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69e374c-c6f7-4d9e-a302-45844b3ca243","Type":"ContainerDied","Data":"00f5771a3a9e81adda147d72dcc505c7c574eac214f135250c7782d1270d0996"} Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.566984 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="d26c80f781595ab631a18d1aaee4115dc9be59de9280be665b2937d58ec45743" exitCode=0 Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.568092 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"d26c80f781595ab631a18d1aaee4115dc9be59de9280be665b2937d58ec45743"} Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.568130 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645"} Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.568148 4830 scope.go:117] "RemoveContainer" containerID="b6699c360d5a693c67050c7e96ce75da1159981ce57e7b4db9870c6676af8453" Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.762837 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-j9jwq"] Dec 03 22:29:27 crc kubenswrapper[4830]: W1203 22:29:27.775171 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod720ca8ef_d526_423c_a334_4e6a771a8e6e.slice/crio-e2cc6703772ce875e8914d9cfb12f6dd49371afec3b97ec7af0a5b9d8e7f782d WatchSource:0}: Error finding container e2cc6703772ce875e8914d9cfb12f6dd49371afec3b97ec7af0a5b9d8e7f782d: Status 404 returned error can't find the container with id e2cc6703772ce875e8914d9cfb12f6dd49371afec3b97ec7af0a5b9d8e7f782d Dec 03 22:29:27 crc kubenswrapper[4830]: I1203 22:29:27.905347 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.080473 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-run-httpd\") pod \"c69e374c-c6f7-4d9e-a302-45844b3ca243\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.080536 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-config-data\") pod \"c69e374c-c6f7-4d9e-a302-45844b3ca243\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.080588 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-sg-core-conf-yaml\") pod \"c69e374c-c6f7-4d9e-a302-45844b3ca243\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.080617 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6s4m\" (UniqueName: \"kubernetes.io/projected/c69e374c-c6f7-4d9e-a302-45844b3ca243-kube-api-access-s6s4m\") pod \"c69e374c-c6f7-4d9e-a302-45844b3ca243\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.080687 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-combined-ca-bundle\") pod \"c69e374c-c6f7-4d9e-a302-45844b3ca243\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.080737 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-scripts\") pod \"c69e374c-c6f7-4d9e-a302-45844b3ca243\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.080893 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-log-httpd\") pod \"c69e374c-c6f7-4d9e-a302-45844b3ca243\" (UID: \"c69e374c-c6f7-4d9e-a302-45844b3ca243\") " Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.082044 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c69e374c-c6f7-4d9e-a302-45844b3ca243" (UID: "c69e374c-c6f7-4d9e-a302-45844b3ca243"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.081980 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c69e374c-c6f7-4d9e-a302-45844b3ca243" (UID: "c69e374c-c6f7-4d9e-a302-45844b3ca243"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.090957 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c69e374c-c6f7-4d9e-a302-45844b3ca243-kube-api-access-s6s4m" (OuterVolumeSpecName: "kube-api-access-s6s4m") pod "c69e374c-c6f7-4d9e-a302-45844b3ca243" (UID: "c69e374c-c6f7-4d9e-a302-45844b3ca243"). InnerVolumeSpecName "kube-api-access-s6s4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.110003 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-scripts" (OuterVolumeSpecName: "scripts") pod "c69e374c-c6f7-4d9e-a302-45844b3ca243" (UID: "c69e374c-c6f7-4d9e-a302-45844b3ca243"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.163285 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c69e374c-c6f7-4d9e-a302-45844b3ca243" (UID: "c69e374c-c6f7-4d9e-a302-45844b3ca243"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.195059 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.195084 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69e374c-c6f7-4d9e-a302-45844b3ca243-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.195092 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.195103 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6s4m\" (UniqueName: \"kubernetes.io/projected/c69e374c-c6f7-4d9e-a302-45844b3ca243-kube-api-access-s6s4m\") on node \"crc\" DevicePath \"\"" Dec 03 
22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.195112 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.269668 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c69e374c-c6f7-4d9e-a302-45844b3ca243" (UID: "c69e374c-c6f7-4d9e-a302-45844b3ca243"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.295712 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-config-data" (OuterVolumeSpecName: "config-data") pod "c69e374c-c6f7-4d9e-a302-45844b3ca243" (UID: "c69e374c-c6f7-4d9e-a302-45844b3ca243"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.297600 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.297634 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69e374c-c6f7-4d9e-a302-45844b3ca243-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.581808 4830 generic.go:334] "Generic (PLEG): container finished" podID="720ca8ef-d526-423c-a334-4e6a771a8e6e" containerID="ada712e13a46cd0510bb004aa90abfd7c51b104f464386e9032c4533cb099365" exitCode=0 Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.581885 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" event={"ID":"720ca8ef-d526-423c-a334-4e6a771a8e6e","Type":"ContainerDied","Data":"ada712e13a46cd0510bb004aa90abfd7c51b104f464386e9032c4533cb099365"} Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.581917 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" event={"ID":"720ca8ef-d526-423c-a334-4e6a771a8e6e","Type":"ContainerStarted","Data":"e2cc6703772ce875e8914d9cfb12f6dd49371afec3b97ec7af0a5b9d8e7f782d"} Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.583624 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"33e1d4ce-81f4-4a02-8ac5-384686943b19","Type":"ContainerStarted","Data":"69bf1f12a7923dce85df031c7f266e307b4f0a994874401847887f745c5d23d5"} Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.589257 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c69e374c-c6f7-4d9e-a302-45844b3ca243","Type":"ContainerDied","Data":"1958a1e866d181af27c17ce431fcccc352ae7404988a7e18d39e7b2c25bff573"} Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.589308 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.589312 4830 scope.go:117] "RemoveContainer" containerID="fa2a3c97db2b7dd6dc82b68e486ae304d82402699b1d50f726e2937da06eb202" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.661759 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.66173498 podStartE2EDuration="2.66173498s" podCreationTimestamp="2025-12-03 22:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:29:28.628136361 +0000 UTC m=+1457.624597720" watchObservedRunningTime="2025-12-03 22:29:28.66173498 +0000 UTC m=+1457.658196329" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.663102 4830 scope.go:117] "RemoveContainer" containerID="5012856d8b265195bd2e96366a5da4fb89584064fe6e0130f7b7184ec84411ca" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.676885 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.686522 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.697278 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:28 crc kubenswrapper[4830]: E1203 22:29:28.697767 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="proxy-httpd" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.697780 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="proxy-httpd" Dec 03 22:29:28 crc kubenswrapper[4830]: E1203 22:29:28.697816 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="ceilometer-notification-agent" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.697822 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="ceilometer-notification-agent" Dec 03 22:29:28 crc kubenswrapper[4830]: E1203 22:29:28.697834 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="sg-core" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.697839 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="sg-core" Dec 03 22:29:28 crc kubenswrapper[4830]: E1203 22:29:28.697855 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="ceilometer-central-agent" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.697861 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="ceilometer-central-agent" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.698062 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="ceilometer-notification-agent" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.698076 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="sg-core" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.698087 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="ceilometer-central-agent" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.698110 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" containerName="proxy-httpd" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.699994 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.707071 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.707313 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.707463 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.709153 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.709208 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.709236 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-log-httpd\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.709257 
4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-config-data\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.709283 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-run-httpd\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.709391 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-scripts\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.709475 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.710143 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdvmh\" (UniqueName: \"kubernetes.io/projected/029f840e-def3-45b4-a109-6769ec5f64db-kube-api-access-kdvmh\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.717474 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 
22:29:28.726397 4830 scope.go:117] "RemoveContainer" containerID="00f5771a3a9e81adda147d72dcc505c7c574eac214f135250c7782d1270d0996" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.780704 4830 scope.go:117] "RemoveContainer" containerID="9cefab27e4cf697e9f9cf881316dbe2a23f4c6786296debd89dda073575986c8" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.811395 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-scripts\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.811493 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.811554 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdvmh\" (UniqueName: \"kubernetes.io/projected/029f840e-def3-45b4-a109-6769ec5f64db-kube-api-access-kdvmh\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.811599 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.811645 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.811669 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-log-httpd\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.811694 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-config-data\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.811717 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-run-httpd\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.812234 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-run-httpd\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.813149 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-log-httpd\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.816794 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-config-data\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.817063 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.817425 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.820785 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.824246 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-scripts\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " pod="openstack/ceilometer-0" Dec 03 22:29:28 crc kubenswrapper[4830]: I1203 22:29:28.834119 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdvmh\" (UniqueName: \"kubernetes.io/projected/029f840e-def3-45b4-a109-6769ec5f64db-kube-api-access-kdvmh\") pod \"ceilometer-0\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " 
pod="openstack/ceilometer-0" Dec 03 22:29:29 crc kubenswrapper[4830]: I1203 22:29:29.041009 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:29:29 crc kubenswrapper[4830]: I1203 22:29:29.286831 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:29 crc kubenswrapper[4830]: I1203 22:29:29.350854 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c69e374c-c6f7-4d9e-a302-45844b3ca243" path="/var/lib/kubelet/pods/c69e374c-c6f7-4d9e-a302-45844b3ca243/volumes" Dec 03 22:29:29 crc kubenswrapper[4830]: I1203 22:29:29.516325 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:29 crc kubenswrapper[4830]: I1203 22:29:29.606424 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:29 crc kubenswrapper[4830]: I1203 22:29:29.608090 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" event={"ID":"720ca8ef-d526-423c-a334-4e6a771a8e6e","Type":"ContainerStarted","Data":"1e3d5f5bfde74bbd04a9649be99a7dc4d5279ba86303e292f6ac3f7ec8a2517c"} Dec 03 22:29:29 crc kubenswrapper[4830]: W1203 22:29:29.608265 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod029f840e_def3_45b4_a109_6769ec5f64db.slice/crio-7bf24cf436c7c8bb92e12d9e080732dcc56e2d0dfa1043027f610f1c19384e2a WatchSource:0}: Error finding container 7bf24cf436c7c8bb92e12d9e080732dcc56e2d0dfa1043027f610f1c19384e2a: Status 404 returned error can't find the container with id 7bf24cf436c7c8bb92e12d9e080732dcc56e2d0dfa1043027f610f1c19384e2a Dec 03 22:29:29 crc kubenswrapper[4830]: I1203 22:29:29.610349 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerName="nova-api-log" 
containerID="cri-o://8002261c5772708d9415dca3b89607ab0876a985914166943e3869dc4256871a" gracePeriod=30 Dec 03 22:29:29 crc kubenswrapper[4830]: I1203 22:29:29.610574 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerName="nova-api-api" containerID="cri-o://5d0bcdaafd5c742338eb38d225f2e28154a46f570f5199770ee9b45b3d954123" gracePeriod=30 Dec 03 22:29:29 crc kubenswrapper[4830]: I1203 22:29:29.630771 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" podStartSLOduration=3.630751767 podStartE2EDuration="3.630751767s" podCreationTimestamp="2025-12-03 22:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:29:29.630019968 +0000 UTC m=+1458.626481317" watchObservedRunningTime="2025-12-03 22:29:29.630751767 +0000 UTC m=+1458.627213116" Dec 03 22:29:30 crc kubenswrapper[4830]: I1203 22:29:30.639046 4830 generic.go:334] "Generic (PLEG): container finished" podID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerID="8002261c5772708d9415dca3b89607ab0876a985914166943e3869dc4256871a" exitCode=143 Dec 03 22:29:30 crc kubenswrapper[4830]: I1203 22:29:30.639102 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1567b5ee-3b86-4957-8cfa-536ac66a34b5","Type":"ContainerDied","Data":"8002261c5772708d9415dca3b89607ab0876a985914166943e3869dc4256871a"} Dec 03 22:29:30 crc kubenswrapper[4830]: I1203 22:29:30.641344 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"029f840e-def3-45b4-a109-6769ec5f64db","Type":"ContainerStarted","Data":"7dbf4adab0384eff2f5dfd5372df3919911593d045a4634cf49dfd6b43a83dfa"} Dec 03 22:29:30 crc kubenswrapper[4830]: I1203 22:29:30.641371 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"029f840e-def3-45b4-a109-6769ec5f64db","Type":"ContainerStarted","Data":"7bf24cf436c7c8bb92e12d9e080732dcc56e2d0dfa1043027f610f1c19384e2a"} Dec 03 22:29:30 crc kubenswrapper[4830]: I1203 22:29:30.641562 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:31 crc kubenswrapper[4830]: I1203 22:29:31.651993 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"029f840e-def3-45b4-a109-6769ec5f64db","Type":"ContainerStarted","Data":"2502067c4ad33697b5c33b31f0ff06320937b3779cd0d6e8d0a183ca157836e0"} Dec 03 22:29:31 crc kubenswrapper[4830]: I1203 22:29:31.926630 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:33 crc kubenswrapper[4830]: I1203 22:29:33.122665 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 22:29:33 crc kubenswrapper[4830]: I1203 22:29:33.674789 4830 generic.go:334] "Generic (PLEG): container finished" podID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerID="5d0bcdaafd5c742338eb38d225f2e28154a46f570f5199770ee9b45b3d954123" exitCode=0 Dec 03 22:29:33 crc kubenswrapper[4830]: I1203 22:29:33.674884 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1567b5ee-3b86-4957-8cfa-536ac66a34b5","Type":"ContainerDied","Data":"5d0bcdaafd5c742338eb38d225f2e28154a46f570f5199770ee9b45b3d954123"} Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.312875 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.331919 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-combined-ca-bundle\") pod \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.332035 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1567b5ee-3b86-4957-8cfa-536ac66a34b5-logs\") pod \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.332063 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-config-data\") pod \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.333251 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1567b5ee-3b86-4957-8cfa-536ac66a34b5-logs" (OuterVolumeSpecName: "logs") pod "1567b5ee-3b86-4957-8cfa-536ac66a34b5" (UID: "1567b5ee-3b86-4957-8cfa-536ac66a34b5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.409224 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-config-data" (OuterVolumeSpecName: "config-data") pod "1567b5ee-3b86-4957-8cfa-536ac66a34b5" (UID: "1567b5ee-3b86-4957-8cfa-536ac66a34b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.424834 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1567b5ee-3b86-4957-8cfa-536ac66a34b5" (UID: "1567b5ee-3b86-4957-8cfa-536ac66a34b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.433696 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xg6g\" (UniqueName: \"kubernetes.io/projected/1567b5ee-3b86-4957-8cfa-536ac66a34b5-kube-api-access-5xg6g\") pod \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\" (UID: \"1567b5ee-3b86-4957-8cfa-536ac66a34b5\") " Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.434350 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1567b5ee-3b86-4957-8cfa-536ac66a34b5-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.434481 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.434577 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1567b5ee-3b86-4957-8cfa-536ac66a34b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.449650 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1567b5ee-3b86-4957-8cfa-536ac66a34b5-kube-api-access-5xg6g" (OuterVolumeSpecName: "kube-api-access-5xg6g") pod "1567b5ee-3b86-4957-8cfa-536ac66a34b5" (UID: 
"1567b5ee-3b86-4957-8cfa-536ac66a34b5"). InnerVolumeSpecName "kube-api-access-5xg6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.537304 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xg6g\" (UniqueName: \"kubernetes.io/projected/1567b5ee-3b86-4957-8cfa-536ac66a34b5-kube-api-access-5xg6g\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.687704 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1567b5ee-3b86-4957-8cfa-536ac66a34b5","Type":"ContainerDied","Data":"3c1955fea67fa06618c032eb8a2990e71632775a7bfeba7e2c1a189a3941b0dc"} Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.688170 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.690936 4830 scope.go:117] "RemoveContainer" containerID="5d0bcdaafd5c742338eb38d225f2e28154a46f570f5199770ee9b45b3d954123" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.690968 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"029f840e-def3-45b4-a109-6769ec5f64db","Type":"ContainerStarted","Data":"009586eddc93b97e8df96e663be669b2bea03269ac57f3b16b1d7848175b1c1d"} Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.734914 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.746928 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.750714 4830 scope.go:117] "RemoveContainer" containerID="8002261c5772708d9415dca3b89607ab0876a985914166943e3869dc4256871a" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.793015 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:34 
crc kubenswrapper[4830]: E1203 22:29:34.793735 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerName="nova-api-log" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.793828 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerName="nova-api-log" Dec 03 22:29:34 crc kubenswrapper[4830]: E1203 22:29:34.793965 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerName="nova-api-api" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.794030 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerName="nova-api-api" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.794343 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerName="nova-api-log" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.794443 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" containerName="nova-api-api" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.795876 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.802609 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.803022 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.803088 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.813452 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.843884 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.844181 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-public-tls-certs\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.844232 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f8d6eb7-7fa6-49cb-bb68-d32498837327-logs\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.844464 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-config-data\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.844626 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.844743 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpg9s\" (UniqueName: \"kubernetes.io/projected/2f8d6eb7-7fa6-49cb-bb68-d32498837327-kube-api-access-kpg9s\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.950767 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.951135 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpg9s\" (UniqueName: \"kubernetes.io/projected/2f8d6eb7-7fa6-49cb-bb68-d32498837327-kube-api-access-kpg9s\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.951299 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " 
pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.951495 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-public-tls-certs\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.951659 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f8d6eb7-7fa6-49cb-bb68-d32498837327-logs\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.951840 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-config-data\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.955754 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f8d6eb7-7fa6-49cb-bb68-d32498837327-logs\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.958111 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-config-data\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.958903 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.959668 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.961208 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-public-tls-certs\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:34 crc kubenswrapper[4830]: I1203 22:29:34.971174 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpg9s\" (UniqueName: \"kubernetes.io/projected/2f8d6eb7-7fa6-49cb-bb68-d32498837327-kube-api-access-kpg9s\") pod \"nova-api-0\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " pod="openstack/nova-api-0" Dec 03 22:29:35 crc kubenswrapper[4830]: I1203 22:29:35.174952 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:35 crc kubenswrapper[4830]: I1203 22:29:35.348950 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1567b5ee-3b86-4957-8cfa-536ac66a34b5" path="/var/lib/kubelet/pods/1567b5ee-3b86-4957-8cfa-536ac66a34b5/volumes" Dec 03 22:29:35 crc kubenswrapper[4830]: I1203 22:29:35.680833 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:35 crc kubenswrapper[4830]: I1203 22:29:35.714237 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"029f840e-def3-45b4-a109-6769ec5f64db","Type":"ContainerStarted","Data":"9ed2b8a980b371638c2496213d6a5f2b6faade6e2cd927aa1e9578cd86308511"} Dec 03 22:29:35 crc kubenswrapper[4830]: I1203 22:29:35.714651 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="ceilometer-central-agent" containerID="cri-o://7dbf4adab0384eff2f5dfd5372df3919911593d045a4634cf49dfd6b43a83dfa" gracePeriod=30 Dec 03 22:29:35 crc kubenswrapper[4830]: I1203 22:29:35.715314 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:29:35 crc kubenswrapper[4830]: I1203 22:29:35.715997 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="proxy-httpd" containerID="cri-o://9ed2b8a980b371638c2496213d6a5f2b6faade6e2cd927aa1e9578cd86308511" gracePeriod=30 Dec 03 22:29:35 crc kubenswrapper[4830]: I1203 22:29:35.716180 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="sg-core" containerID="cri-o://009586eddc93b97e8df96e663be669b2bea03269ac57f3b16b1d7848175b1c1d" gracePeriod=30 Dec 03 22:29:35 crc kubenswrapper[4830]: I1203 
22:29:35.716357 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="ceilometer-notification-agent" containerID="cri-o://2502067c4ad33697b5c33b31f0ff06320937b3779cd0d6e8d0a183ca157836e0" gracePeriod=30 Dec 03 22:29:35 crc kubenswrapper[4830]: I1203 22:29:35.721344 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f8d6eb7-7fa6-49cb-bb68-d32498837327","Type":"ContainerStarted","Data":"b48e2898340efefaf29924468662da98ad805b818086a49bbf4a80bf7c01a7e9"} Dec 03 22:29:35 crc kubenswrapper[4830]: I1203 22:29:35.752693 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.296732998 podStartE2EDuration="7.752670337s" podCreationTimestamp="2025-12-03 22:29:28 +0000 UTC" firstStartedPulling="2025-12-03 22:29:29.612308148 +0000 UTC m=+1458.608769497" lastFinishedPulling="2025-12-03 22:29:35.068245487 +0000 UTC m=+1464.064706836" observedRunningTime="2025-12-03 22:29:35.738434542 +0000 UTC m=+1464.734895891" watchObservedRunningTime="2025-12-03 22:29:35.752670337 +0000 UTC m=+1464.749131676" Dec 03 22:29:36 crc kubenswrapper[4830]: E1203 22:29:36.127951 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod029f840e_def3_45b4_a109_6769ec5f64db.slice/crio-conmon-2502067c4ad33697b5c33b31f0ff06320937b3779cd0d6e8d0a183ca157836e0.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:29:36 crc kubenswrapper[4830]: I1203 22:29:36.735792 4830 generic.go:334] "Generic (PLEG): container finished" podID="029f840e-def3-45b4-a109-6769ec5f64db" containerID="009586eddc93b97e8df96e663be669b2bea03269ac57f3b16b1d7848175b1c1d" exitCode=2 Dec 03 22:29:36 crc kubenswrapper[4830]: I1203 22:29:36.736056 4830 generic.go:334] "Generic (PLEG): 
container finished" podID="029f840e-def3-45b4-a109-6769ec5f64db" containerID="2502067c4ad33697b5c33b31f0ff06320937b3779cd0d6e8d0a183ca157836e0" exitCode=0 Dec 03 22:29:36 crc kubenswrapper[4830]: I1203 22:29:36.735936 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"029f840e-def3-45b4-a109-6769ec5f64db","Type":"ContainerDied","Data":"009586eddc93b97e8df96e663be669b2bea03269ac57f3b16b1d7848175b1c1d"} Dec 03 22:29:36 crc kubenswrapper[4830]: I1203 22:29:36.736127 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"029f840e-def3-45b4-a109-6769ec5f64db","Type":"ContainerDied","Data":"2502067c4ad33697b5c33b31f0ff06320937b3779cd0d6e8d0a183ca157836e0"} Dec 03 22:29:36 crc kubenswrapper[4830]: I1203 22:29:36.738520 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f8d6eb7-7fa6-49cb-bb68-d32498837327","Type":"ContainerStarted","Data":"544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d"} Dec 03 22:29:36 crc kubenswrapper[4830]: I1203 22:29:36.738551 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f8d6eb7-7fa6-49cb-bb68-d32498837327","Type":"ContainerStarted","Data":"a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08"} Dec 03 22:29:36 crc kubenswrapper[4830]: I1203 22:29:36.780922 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.780893366 podStartE2EDuration="2.780893366s" podCreationTimestamp="2025-12-03 22:29:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:29:36.762202411 +0000 UTC m=+1465.758663770" watchObservedRunningTime="2025-12-03 22:29:36.780893366 +0000 UTC m=+1465.777354745" Dec 03 22:29:36 crc kubenswrapper[4830]: I1203 22:29:36.926867 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:36 crc kubenswrapper[4830]: I1203 22:29:36.951628 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:37 crc kubenswrapper[4830]: I1203 22:29:37.197473 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:29:37 crc kubenswrapper[4830]: I1203 22:29:37.279292 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-6bnv5"] Dec 03 22:29:37 crc kubenswrapper[4830]: I1203 22:29:37.279674 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" podUID="58d17afd-4fea-430f-951a-98d40b505b9d" containerName="dnsmasq-dns" containerID="cri-o://cc5bf7687dfb8cc365f2b312194cd250fa43e7f499bce34a1b981d42d8031039" gracePeriod=10 Dec 03 22:29:37 crc kubenswrapper[4830]: I1203 22:29:37.749378 4830 generic.go:334] "Generic (PLEG): container finished" podID="029f840e-def3-45b4-a109-6769ec5f64db" containerID="7dbf4adab0384eff2f5dfd5372df3919911593d045a4634cf49dfd6b43a83dfa" exitCode=0 Dec 03 22:29:37 crc kubenswrapper[4830]: I1203 22:29:37.749830 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"029f840e-def3-45b4-a109-6769ec5f64db","Type":"ContainerDied","Data":"7dbf4adab0384eff2f5dfd5372df3919911593d045a4634cf49dfd6b43a83dfa"} Dec 03 22:29:37 crc kubenswrapper[4830]: I1203 22:29:37.752795 4830 generic.go:334] "Generic (PLEG): container finished" podID="58d17afd-4fea-430f-951a-98d40b505b9d" containerID="cc5bf7687dfb8cc365f2b312194cd250fa43e7f499bce34a1b981d42d8031039" exitCode=0 Dec 03 22:29:37 crc kubenswrapper[4830]: I1203 22:29:37.752966 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" 
event={"ID":"58d17afd-4fea-430f-951a-98d40b505b9d","Type":"ContainerDied","Data":"cc5bf7687dfb8cc365f2b312194cd250fa43e7f499bce34a1b981d42d8031039"} Dec 03 22:29:37 crc kubenswrapper[4830]: I1203 22:29:37.771544 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.028070 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7zlj6"] Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.030199 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.034981 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.035155 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.035751 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdwmq\" (UniqueName: \"kubernetes.io/projected/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-kube-api-access-cdwmq\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.035817 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-config-data\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.035888 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-scripts\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.035941 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.040672 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7zlj6"] Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.138243 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-scripts\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.138397 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.138493 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdwmq\" (UniqueName: \"kubernetes.io/projected/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-kube-api-access-cdwmq\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 
22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.138639 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-config-data\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.145649 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-scripts\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.146098 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.157705 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-config-data\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.163988 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdwmq\" (UniqueName: \"kubernetes.io/projected/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-kube-api-access-cdwmq\") pod \"nova-cell1-cell-mapping-7zlj6\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.351109 4830 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:38 crc kubenswrapper[4830]: I1203 22:29:38.407253 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" podUID="58d17afd-4fea-430f-951a-98d40b505b9d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.215:5353: connect: connection refused" Dec 03 22:29:39 crc kubenswrapper[4830]: I1203 22:29:39.773074 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" event={"ID":"58d17afd-4fea-430f-951a-98d40b505b9d","Type":"ContainerDied","Data":"8a68372349b4b2fa617d38b982527426fc3da47782a914cb8eec5a81d70517fb"} Dec 03 22:29:39 crc kubenswrapper[4830]: I1203 22:29:39.773625 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a68372349b4b2fa617d38b982527426fc3da47782a914cb8eec5a81d70517fb" Dec 03 22:29:39 crc kubenswrapper[4830]: I1203 22:29:39.789162 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7zlj6"] Dec 03 22:29:39 crc kubenswrapper[4830]: W1203 22:29:39.802726 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc88e63f3_37ab_4eb8_8fe7_c2d875e62bb7.slice/crio-4d9fe2f672920227758da80e878c8e9b883c883b625849d7a4c49804e5a95b7c WatchSource:0}: Error finding container 4d9fe2f672920227758da80e878c8e9b883c883b625849d7a4c49804e5a95b7c: Status 404 returned error can't find the container with id 4d9fe2f672920227758da80e878c8e9b883c883b625849d7a4c49804e5a95b7c Dec 03 22:29:39 crc kubenswrapper[4830]: I1203 22:29:39.996409 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.084168 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-nb\") pod \"58d17afd-4fea-430f-951a-98d40b505b9d\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.084210 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-svc\") pod \"58d17afd-4fea-430f-951a-98d40b505b9d\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.084292 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-sb\") pod \"58d17afd-4fea-430f-951a-98d40b505b9d\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.084435 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrq2g\" (UniqueName: \"kubernetes.io/projected/58d17afd-4fea-430f-951a-98d40b505b9d-kube-api-access-xrq2g\") pod \"58d17afd-4fea-430f-951a-98d40b505b9d\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.084519 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-swift-storage-0\") pod \"58d17afd-4fea-430f-951a-98d40b505b9d\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.084623 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-config\") pod \"58d17afd-4fea-430f-951a-98d40b505b9d\" (UID: \"58d17afd-4fea-430f-951a-98d40b505b9d\") " Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.089666 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d17afd-4fea-430f-951a-98d40b505b9d-kube-api-access-xrq2g" (OuterVolumeSpecName: "kube-api-access-xrq2g") pod "58d17afd-4fea-430f-951a-98d40b505b9d" (UID: "58d17afd-4fea-430f-951a-98d40b505b9d"). InnerVolumeSpecName "kube-api-access-xrq2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.155237 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "58d17afd-4fea-430f-951a-98d40b505b9d" (UID: "58d17afd-4fea-430f-951a-98d40b505b9d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.164159 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "58d17afd-4fea-430f-951a-98d40b505b9d" (UID: "58d17afd-4fea-430f-951a-98d40b505b9d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.166410 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-config" (OuterVolumeSpecName: "config") pod "58d17afd-4fea-430f-951a-98d40b505b9d" (UID: "58d17afd-4fea-430f-951a-98d40b505b9d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.181012 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "58d17afd-4fea-430f-951a-98d40b505b9d" (UID: "58d17afd-4fea-430f-951a-98d40b505b9d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.186311 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.186332 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrq2g\" (UniqueName: \"kubernetes.io/projected/58d17afd-4fea-430f-951a-98d40b505b9d-kube-api-access-xrq2g\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.186345 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.186354 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.186362 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.187048 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "58d17afd-4fea-430f-951a-98d40b505b9d" (UID: "58d17afd-4fea-430f-951a-98d40b505b9d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.287889 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58d17afd-4fea-430f-951a-98d40b505b9d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.787698 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-6bnv5" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.790632 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7zlj6" event={"ID":"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7","Type":"ContainerStarted","Data":"13eaa48ce58652a6ac20ba63776fb0bfcc6ea09b3d81eef56912ec00cf879dd3"} Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.790704 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7zlj6" event={"ID":"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7","Type":"ContainerStarted","Data":"4d9fe2f672920227758da80e878c8e9b883c883b625849d7a4c49804e5a95b7c"} Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.819278 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7zlj6" podStartSLOduration=2.819258835 podStartE2EDuration="2.819258835s" podCreationTimestamp="2025-12-03 22:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:29:40.809394608 +0000 UTC m=+1469.805855977" watchObservedRunningTime="2025-12-03 22:29:40.819258835 +0000 UTC m=+1469.815720184" Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.837882 4830 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-6bnv5"] Dec 03 22:29:40 crc kubenswrapper[4830]: I1203 22:29:40.850977 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-6bnv5"] Dec 03 22:29:41 crc kubenswrapper[4830]: I1203 22:29:41.351386 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d17afd-4fea-430f-951a-98d40b505b9d" path="/var/lib/kubelet/pods/58d17afd-4fea-430f-951a-98d40b505b9d/volumes" Dec 03 22:29:45 crc kubenswrapper[4830]: I1203 22:29:45.176384 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:29:45 crc kubenswrapper[4830]: I1203 22:29:45.177003 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:29:45 crc kubenswrapper[4830]: I1203 22:29:45.875590 4830 generic.go:334] "Generic (PLEG): container finished" podID="c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7" containerID="13eaa48ce58652a6ac20ba63776fb0bfcc6ea09b3d81eef56912ec00cf879dd3" exitCode=0 Dec 03 22:29:45 crc kubenswrapper[4830]: I1203 22:29:45.875640 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7zlj6" event={"ID":"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7","Type":"ContainerDied","Data":"13eaa48ce58652a6ac20ba63776fb0bfcc6ea09b3d81eef56912ec00cf879dd3"} Dec 03 22:29:46 crc kubenswrapper[4830]: I1203 22:29:46.189654 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.227:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:29:46 crc kubenswrapper[4830]: I1203 22:29:46.189652 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerName="nova-api-api" 
probeResult="failure" output="Get \"https://10.217.0.227:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.371175 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.469524 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-combined-ca-bundle\") pod \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.469621 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdwmq\" (UniqueName: \"kubernetes.io/projected/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-kube-api-access-cdwmq\") pod \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.469665 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-config-data\") pod \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.469698 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-scripts\") pod \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\" (UID: \"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7\") " Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.475247 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-scripts" (OuterVolumeSpecName: "scripts") pod 
"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7" (UID: "c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.489069 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-kube-api-access-cdwmq" (OuterVolumeSpecName: "kube-api-access-cdwmq") pod "c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7" (UID: "c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7"). InnerVolumeSpecName "kube-api-access-cdwmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.514494 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-config-data" (OuterVolumeSpecName: "config-data") pod "c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7" (UID: "c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.514666 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7" (UID: "c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.572334 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.572387 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdwmq\" (UniqueName: \"kubernetes.io/projected/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-kube-api-access-cdwmq\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.572404 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.572417 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.900666 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7zlj6" event={"ID":"c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7","Type":"ContainerDied","Data":"4d9fe2f672920227758da80e878c8e9b883c883b625849d7a4c49804e5a95b7c"} Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.901095 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d9fe2f672920227758da80e878c8e9b883c883b625849d7a4c49804e5a95b7c" Dec 03 22:29:47 crc kubenswrapper[4830]: I1203 22:29:47.900916 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7zlj6" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.098354 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.098671 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerName="nova-api-log" containerID="cri-o://a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08" gracePeriod=30 Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.098842 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerName="nova-api-api" containerID="cri-o://544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d" gracePeriod=30 Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.123758 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.124071 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b264ecc9-1aeb-4e6e-a342-73b4675c492a" containerName="nova-scheduler-scheduler" containerID="cri-o://8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450" gracePeriod=30 Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.128719 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.128962 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-log" containerID="cri-o://8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb" gracePeriod=30 Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.129148 4830 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-metadata" containerID="cri-o://0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261" gracePeriod=30 Dec 03 22:29:48 crc kubenswrapper[4830]: E1203 22:29:48.602212 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 22:29:48 crc kubenswrapper[4830]: E1203 22:29:48.603762 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 22:29:48 crc kubenswrapper[4830]: E1203 22:29:48.605336 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 22:29:48 crc kubenswrapper[4830]: E1203 22:29:48.605378 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b264ecc9-1aeb-4e6e-a342-73b4675c492a" containerName="nova-scheduler-scheduler" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.693866 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-q88bf"] Dec 03 22:29:48 crc kubenswrapper[4830]: E1203 22:29:48.694318 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d17afd-4fea-430f-951a-98d40b505b9d" containerName="init" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.694341 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d17afd-4fea-430f-951a-98d40b505b9d" containerName="init" Dec 03 22:29:48 crc kubenswrapper[4830]: E1203 22:29:48.694363 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7" containerName="nova-manage" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.694371 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7" containerName="nova-manage" Dec 03 22:29:48 crc kubenswrapper[4830]: E1203 22:29:48.694414 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d17afd-4fea-430f-951a-98d40b505b9d" containerName="dnsmasq-dns" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.694423 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d17afd-4fea-430f-951a-98d40b505b9d" containerName="dnsmasq-dns" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.694696 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d17afd-4fea-430f-951a-98d40b505b9d" containerName="dnsmasq-dns" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.694731 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7" containerName="nova-manage" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.696404 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.714460 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q88bf"] Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.800021 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-utilities\") pod \"community-operators-q88bf\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.800087 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-catalog-content\") pod \"community-operators-q88bf\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.800325 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7dvs\" (UniqueName: \"kubernetes.io/projected/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-kube-api-access-w7dvs\") pod \"community-operators-q88bf\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.902210 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-utilities\") pod \"community-operators-q88bf\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.902278 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-catalog-content\") pod \"community-operators-q88bf\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.902359 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7dvs\" (UniqueName: \"kubernetes.io/projected/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-kube-api-access-w7dvs\") pod \"community-operators-q88bf\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.903258 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-catalog-content\") pod \"community-operators-q88bf\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.903270 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-utilities\") pod \"community-operators-q88bf\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.911972 4830 generic.go:334] "Generic (PLEG): container finished" podID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerID="8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb" exitCode=143 Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.912034 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"5e1d3561-51e0-4713-a94f-59f8dcce1e29","Type":"ContainerDied","Data":"8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb"} Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.913963 4830 generic.go:334] "Generic (PLEG): container finished" podID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerID="a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08" exitCode=143 Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.914020 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f8d6eb7-7fa6-49cb-bb68-d32498837327","Type":"ContainerDied","Data":"a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08"} Dec 03 22:29:48 crc kubenswrapper[4830]: I1203 22:29:48.938420 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7dvs\" (UniqueName: \"kubernetes.io/projected/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-kube-api-access-w7dvs\") pod \"community-operators-q88bf\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:49 crc kubenswrapper[4830]: I1203 22:29:49.056429 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:49 crc kubenswrapper[4830]: I1203 22:29:49.596259 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q88bf"] Dec 03 22:29:49 crc kubenswrapper[4830]: I1203 22:29:49.926083 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q88bf" event={"ID":"0351bdb5-464e-4a70-8c3f-ea5620cc0c31","Type":"ContainerStarted","Data":"b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50"} Dec 03 22:29:49 crc kubenswrapper[4830]: I1203 22:29:49.926499 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q88bf" event={"ID":"0351bdb5-464e-4a70-8c3f-ea5620cc0c31","Type":"ContainerStarted","Data":"cdf1e31890faabbae89c144f2d2c51c691a573e72c4a5994b67e93b19174299b"} Dec 03 22:29:50 crc kubenswrapper[4830]: I1203 22:29:50.942249 4830 generic.go:334] "Generic (PLEG): container finished" podID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" containerID="b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50" exitCode=0 Dec 03 22:29:50 crc kubenswrapper[4830]: I1203 22:29:50.942296 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q88bf" event={"ID":"0351bdb5-464e-4a70-8c3f-ea5620cc0c31","Type":"ContainerDied","Data":"b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50"} Dec 03 22:29:51 crc kubenswrapper[4830]: I1203 22:29:51.923652 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:51 crc kubenswrapper[4830]: I1203 22:29:51.929205 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:29:51 crc kubenswrapper[4830]: I1203 22:29:51.966973 4830 generic.go:334] "Generic (PLEG): container finished" podID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerID="0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261" exitCode=0 Dec 03 22:29:51 crc kubenswrapper[4830]: I1203 22:29:51.967033 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e1d3561-51e0-4713-a94f-59f8dcce1e29","Type":"ContainerDied","Data":"0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261"} Dec 03 22:29:51 crc kubenswrapper[4830]: I1203 22:29:51.967059 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e1d3561-51e0-4713-a94f-59f8dcce1e29","Type":"ContainerDied","Data":"268aa1086902b3172bdf8902e8478ed49719d91cba650088c9a24b06d1ca97de"} Dec 03 22:29:51 crc kubenswrapper[4830]: I1203 22:29:51.967074 4830 scope.go:117] "RemoveContainer" containerID="0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261" Dec 03 22:29:51 crc kubenswrapper[4830]: I1203 22:29:51.967192 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:29:51 crc kubenswrapper[4830]: I1203 22:29:51.997092 4830 generic.go:334] "Generic (PLEG): container finished" podID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerID="544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d" exitCode=0 Dec 03 22:29:51 crc kubenswrapper[4830]: I1203 22:29:51.997184 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f8d6eb7-7fa6-49cb-bb68-d32498837327","Type":"ContainerDied","Data":"544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d"} Dec 03 22:29:51 crc kubenswrapper[4830]: I1203 22:29:51.997218 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f8d6eb7-7fa6-49cb-bb68-d32498837327","Type":"ContainerDied","Data":"b48e2898340efefaf29924468662da98ad805b818086a49bbf4a80bf7c01a7e9"} Dec 03 22:29:51 crc kubenswrapper[4830]: I1203 22:29:51.997283 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.001808 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q88bf" event={"ID":"0351bdb5-464e-4a70-8c3f-ea5620cc0c31","Type":"ContainerStarted","Data":"07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014"} Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.016798 4830 scope.go:117] "RemoveContainer" containerID="8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.078876 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-config-data\") pod \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.078968 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f8d6eb7-7fa6-49cb-bb68-d32498837327-logs\") pod \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.079011 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-public-tls-certs\") pod \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.079047 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-internal-tls-certs\") pod \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 
22:29:52.079116 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-combined-ca-bundle\") pod \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.079137 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-config-data\") pod \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.079461 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f8d6eb7-7fa6-49cb-bb68-d32498837327-logs" (OuterVolumeSpecName: "logs") pod "2f8d6eb7-7fa6-49cb-bb68-d32498837327" (UID: "2f8d6eb7-7fa6-49cb-bb68-d32498837327"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.079531 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-combined-ca-bundle\") pod \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.079574 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-nova-metadata-tls-certs\") pod \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.079709 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpg9s\" (UniqueName: \"kubernetes.io/projected/2f8d6eb7-7fa6-49cb-bb68-d32498837327-kube-api-access-kpg9s\") pod \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\" (UID: \"2f8d6eb7-7fa6-49cb-bb68-d32498837327\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.079739 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e1d3561-51e0-4713-a94f-59f8dcce1e29-logs\") pod \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.079867 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mm4d\" (UniqueName: \"kubernetes.io/projected/5e1d3561-51e0-4713-a94f-59f8dcce1e29-kube-api-access-7mm4d\") pod \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\" (UID: \"5e1d3561-51e0-4713-a94f-59f8dcce1e29\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.080655 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2f8d6eb7-7fa6-49cb-bb68-d32498837327-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.085319 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e1d3561-51e0-4713-a94f-59f8dcce1e29-logs" (OuterVolumeSpecName: "logs") pod "5e1d3561-51e0-4713-a94f-59f8dcce1e29" (UID: "5e1d3561-51e0-4713-a94f-59f8dcce1e29"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.085785 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1d3561-51e0-4713-a94f-59f8dcce1e29-kube-api-access-7mm4d" (OuterVolumeSpecName: "kube-api-access-7mm4d") pod "5e1d3561-51e0-4713-a94f-59f8dcce1e29" (UID: "5e1d3561-51e0-4713-a94f-59f8dcce1e29"). InnerVolumeSpecName "kube-api-access-7mm4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.100798 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f8d6eb7-7fa6-49cb-bb68-d32498837327-kube-api-access-kpg9s" (OuterVolumeSpecName: "kube-api-access-kpg9s") pod "2f8d6eb7-7fa6-49cb-bb68-d32498837327" (UID: "2f8d6eb7-7fa6-49cb-bb68-d32498837327"). InnerVolumeSpecName "kube-api-access-kpg9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.107646 4830 scope.go:117] "RemoveContainer" containerID="0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261" Dec 03 22:29:52 crc kubenswrapper[4830]: E1203 22:29:52.110864 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261\": container with ID starting with 0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261 not found: ID does not exist" containerID="0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.110923 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261"} err="failed to get container status \"0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261\": rpc error: code = NotFound desc = could not find container \"0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261\": container with ID starting with 0e81e746962b0b3b138ec9c64ecd9b9ebf1bf5d7ec7856b6de3f87cfe1318261 not found: ID does not exist" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.110958 4830 scope.go:117] "RemoveContainer" containerID="8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb" Dec 03 22:29:52 crc kubenswrapper[4830]: E1203 22:29:52.111452 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb\": container with ID starting with 8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb not found: ID does not exist" containerID="8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.111478 
4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb"} err="failed to get container status \"8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb\": rpc error: code = NotFound desc = could not find container \"8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb\": container with ID starting with 8a53c1b682ff0a6a9095f265cc1b3b03bc0b7bfd7647b1cd1a7a6313fe601cdb not found: ID does not exist" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.111494 4830 scope.go:117] "RemoveContainer" containerID="544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.113372 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-config-data" (OuterVolumeSpecName: "config-data") pod "2f8d6eb7-7fa6-49cb-bb68-d32498837327" (UID: "2f8d6eb7-7fa6-49cb-bb68-d32498837327"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.122720 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f8d6eb7-7fa6-49cb-bb68-d32498837327" (UID: "2f8d6eb7-7fa6-49cb-bb68-d32498837327"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.130720 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-config-data" (OuterVolumeSpecName: "config-data") pod "5e1d3561-51e0-4713-a94f-59f8dcce1e29" (UID: "5e1d3561-51e0-4713-a94f-59f8dcce1e29"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.134854 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e1d3561-51e0-4713-a94f-59f8dcce1e29" (UID: "5e1d3561-51e0-4713-a94f-59f8dcce1e29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.167961 4830 scope.go:117] "RemoveContainer" containerID="a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.168644 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5e1d3561-51e0-4713-a94f-59f8dcce1e29" (UID: "5e1d3561-51e0-4713-a94f-59f8dcce1e29"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.174461 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2f8d6eb7-7fa6-49cb-bb68-d32498837327" (UID: "2f8d6eb7-7fa6-49cb-bb68-d32498837327"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.175346 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2f8d6eb7-7fa6-49cb-bb68-d32498837327" (UID: "2f8d6eb7-7fa6-49cb-bb68-d32498837327"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.183409 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mm4d\" (UniqueName: \"kubernetes.io/projected/5e1d3561-51e0-4713-a94f-59f8dcce1e29-kube-api-access-7mm4d\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.183442 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.183455 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.183466 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.183479 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.183489 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.183518 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8d6eb7-7fa6-49cb-bb68-d32498837327-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 
22:29:52.183530 4830 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1d3561-51e0-4713-a94f-59f8dcce1e29-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.183542 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpg9s\" (UniqueName: \"kubernetes.io/projected/2f8d6eb7-7fa6-49cb-bb68-d32498837327-kube-api-access-kpg9s\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.183552 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e1d3561-51e0-4713-a94f-59f8dcce1e29-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.203931 4830 scope.go:117] "RemoveContainer" containerID="544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d" Dec 03 22:29:52 crc kubenswrapper[4830]: E1203 22:29:52.204257 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d\": container with ID starting with 544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d not found: ID does not exist" containerID="544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.204302 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d"} err="failed to get container status \"544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d\": rpc error: code = NotFound desc = could not find container \"544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d\": container with ID starting with 544e4a8688a7f275f19ee0c123675ca406347252bed6e71628110d8ae8e3561d not found: ID does not 
exist" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.204330 4830 scope.go:117] "RemoveContainer" containerID="a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08" Dec 03 22:29:52 crc kubenswrapper[4830]: E1203 22:29:52.204733 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08\": container with ID starting with a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08 not found: ID does not exist" containerID="a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.204756 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08"} err="failed to get container status \"a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08\": rpc error: code = NotFound desc = could not find container \"a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08\": container with ID starting with a827c78838cbf36509e64219c9fb9c0e639c3136373df90de9eba9e815a4cb08 not found: ID does not exist" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.323627 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.335351 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.344799 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:52 crc kubenswrapper[4830]: E1203 22:29:52.345279 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerName="nova-api-api" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.345300 4830 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerName="nova-api-api" Dec 03 22:29:52 crc kubenswrapper[4830]: E1203 22:29:52.345314 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-metadata" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.345323 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-metadata" Dec 03 22:29:52 crc kubenswrapper[4830]: E1203 22:29:52.345341 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerName="nova-api-log" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.345347 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerName="nova-api-log" Dec 03 22:29:52 crc kubenswrapper[4830]: E1203 22:29:52.345362 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-log" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.345368 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-log" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.345566 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerName="nova-api-log" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.345597 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-log" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.345611 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-metadata" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.345647 4830 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" containerName="nova-api-api" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.347978 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.352835 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.353716 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.354723 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.492178 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.492231 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.492268 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-config-data\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.492300 
4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4jqz\" (UniqueName: \"kubernetes.io/projected/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-kube-api-access-q4jqz\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.492340 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-logs\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.586135 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.592012 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.596706 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.596753 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.596787 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-config-data\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.596821 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4jqz\" (UniqueName: \"kubernetes.io/projected/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-kube-api-access-q4jqz\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.596864 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-logs\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.597464 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-logs\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.607382 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-config-data\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.607451 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 
22:29:52.607705 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.613616 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.632931 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4jqz\" (UniqueName: \"kubernetes.io/projected/b44e63ea-f87b-48f3-8af7-bc3e35ce5265-kube-api-access-q4jqz\") pod \"nova-metadata-0\" (UID: \"b44e63ea-f87b-48f3-8af7-bc3e35ce5265\") " pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.637587 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:52 crc kubenswrapper[4830]: E1203 22:29:52.638207 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b264ecc9-1aeb-4e6e-a342-73b4675c492a" containerName="nova-scheduler-scheduler" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.638233 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b264ecc9-1aeb-4e6e-a342-73b4675c492a" containerName="nova-scheduler-scheduler" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.638551 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b264ecc9-1aeb-4e6e-a342-73b4675c492a" containerName="nova-scheduler-scheduler" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.640119 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.642312 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.642700 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.644193 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.646403 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.697886 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-combined-ca-bundle\") pod \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\" (UID: \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.698025 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-config-data\") pod \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\" (UID: \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.698117 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgwpr\" (UniqueName: \"kubernetes.io/projected/b264ecc9-1aeb-4e6e-a342-73b4675c492a-kube-api-access-sgwpr\") pod \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\" (UID: \"b264ecc9-1aeb-4e6e-a342-73b4675c492a\") " Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.701856 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b264ecc9-1aeb-4e6e-a342-73b4675c492a-kube-api-access-sgwpr" 
(OuterVolumeSpecName: "kube-api-access-sgwpr") pod "b264ecc9-1aeb-4e6e-a342-73b4675c492a" (UID: "b264ecc9-1aeb-4e6e-a342-73b4675c492a"). InnerVolumeSpecName "kube-api-access-sgwpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.726365 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b264ecc9-1aeb-4e6e-a342-73b4675c492a" (UID: "b264ecc9-1aeb-4e6e-a342-73b4675c492a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.726685 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-config-data" (OuterVolumeSpecName: "config-data") pod "b264ecc9-1aeb-4e6e-a342-73b4675c492a" (UID: "b264ecc9-1aeb-4e6e-a342-73b4675c492a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.800539 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-config-data\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.800596 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.800645 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.800949 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-logs\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.801118 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cmhq\" (UniqueName: \"kubernetes.io/projected/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-kube-api-access-8cmhq\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.801190 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-public-tls-certs\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.801411 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.801439 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b264ecc9-1aeb-4e6e-a342-73b4675c492a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.801536 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgwpr\" (UniqueName: \"kubernetes.io/projected/b264ecc9-1aeb-4e6e-a342-73b4675c492a-kube-api-access-sgwpr\") on node \"crc\" DevicePath \"\"" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.885644 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.903740 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.903894 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-logs\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.903949 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cmhq\" (UniqueName: \"kubernetes.io/projected/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-kube-api-access-8cmhq\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.903985 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-public-tls-certs\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.904066 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-config-data\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.904102 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.904760 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-logs\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.907121 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.907137 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.908121 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-config-data\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.909820 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-public-tls-certs\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:52 crc kubenswrapper[4830]: I1203 22:29:52.922848 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8cmhq\" (UniqueName: \"kubernetes.io/projected/22dd5f84-7ae4-442e-b6a1-dd27b2d3875d-kube-api-access-8cmhq\") pod \"nova-api-0\" (UID: \"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d\") " pod="openstack/nova-api-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.017328 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.034074 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.034127 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b264ecc9-1aeb-4e6e-a342-73b4675c492a","Type":"ContainerDied","Data":"8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450"} Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.034167 4830 scope.go:117] "RemoveContainer" containerID="8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.034099 4830 generic.go:334] "Generic (PLEG): container finished" podID="b264ecc9-1aeb-4e6e-a342-73b4675c492a" containerID="8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450" exitCode=0 Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.034364 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b264ecc9-1aeb-4e6e-a342-73b4675c492a","Type":"ContainerDied","Data":"2f2bf73bf72078c7dcf91e1d8af9a73fdde6792e6a59131bbe3b8a1f2e412099"} Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.039442 4830 generic.go:334] "Generic (PLEG): container finished" podID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" containerID="07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014" exitCode=0 Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.039481 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-q88bf" event={"ID":"0351bdb5-464e-4a70-8c3f-ea5620cc0c31","Type":"ContainerDied","Data":"07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014"} Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.098019 4830 scope.go:117] "RemoveContainer" containerID="8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450" Dec 03 22:29:53 crc kubenswrapper[4830]: E1203 22:29:53.098791 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450\": container with ID starting with 8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450 not found: ID does not exist" containerID="8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.098831 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450"} err="failed to get container status \"8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450\": rpc error: code = NotFound desc = could not find container \"8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450\": container with ID starting with 8846fa9866377e1def3e6351a4b01a874db564a06379f8f0e813195567c25450 not found: ID does not exist" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.105617 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.118325 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.133672 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.139301 4830 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.141378 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.154137 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.310950 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd8f4ca-85be-49df-b95a-adb609cbbff2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4dd8f4ca-85be-49df-b95a-adb609cbbff2\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.310986 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfvmt\" (UniqueName: \"kubernetes.io/projected/4dd8f4ca-85be-49df-b95a-adb609cbbff2-kube-api-access-xfvmt\") pod \"nova-scheduler-0\" (UID: \"4dd8f4ca-85be-49df-b95a-adb609cbbff2\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.311056 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd8f4ca-85be-49df-b95a-adb609cbbff2-config-data\") pod \"nova-scheduler-0\" (UID: \"4dd8f4ca-85be-49df-b95a-adb609cbbff2\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.351185 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f8d6eb7-7fa6-49cb-bb68-d32498837327" path="/var/lib/kubelet/pods/2f8d6eb7-7fa6-49cb-bb68-d32498837327/volumes" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.352248 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" 
path="/var/lib/kubelet/pods/5e1d3561-51e0-4713-a94f-59f8dcce1e29/volumes" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.353837 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b264ecc9-1aeb-4e6e-a342-73b4675c492a" path="/var/lib/kubelet/pods/b264ecc9-1aeb-4e6e-a342-73b4675c492a/volumes" Dec 03 22:29:53 crc kubenswrapper[4830]: W1203 22:29:53.369738 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb44e63ea_f87b_48f3_8af7_bc3e35ce5265.slice/crio-1146a7fb042c017b8077927c06a39a62754d0865f4232e5a0e6490b23594148c WatchSource:0}: Error finding container 1146a7fb042c017b8077927c06a39a62754d0865f4232e5a0e6490b23594148c: Status 404 returned error can't find the container with id 1146a7fb042c017b8077927c06a39a62754d0865f4232e5a0e6490b23594148c Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.375293 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.413983 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd8f4ca-85be-49df-b95a-adb609cbbff2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4dd8f4ca-85be-49df-b95a-adb609cbbff2\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.414028 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfvmt\" (UniqueName: \"kubernetes.io/projected/4dd8f4ca-85be-49df-b95a-adb609cbbff2-kube-api-access-xfvmt\") pod \"nova-scheduler-0\" (UID: \"4dd8f4ca-85be-49df-b95a-adb609cbbff2\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.414105 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4dd8f4ca-85be-49df-b95a-adb609cbbff2-config-data\") pod \"nova-scheduler-0\" (UID: \"4dd8f4ca-85be-49df-b95a-adb609cbbff2\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.419866 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd8f4ca-85be-49df-b95a-adb609cbbff2-config-data\") pod \"nova-scheduler-0\" (UID: \"4dd8f4ca-85be-49df-b95a-adb609cbbff2\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.420481 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd8f4ca-85be-49df-b95a-adb609cbbff2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4dd8f4ca-85be-49df-b95a-adb609cbbff2\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.430412 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfvmt\" (UniqueName: \"kubernetes.io/projected/4dd8f4ca-85be-49df-b95a-adb609cbbff2-kube-api-access-xfvmt\") pod \"nova-scheduler-0\" (UID: \"4dd8f4ca-85be-49df-b95a-adb609cbbff2\") " pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.461733 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.568438 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:29:53 crc kubenswrapper[4830]: W1203 22:29:53.579318 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22dd5f84_7ae4_442e_b6a1_dd27b2d3875d.slice/crio-f180c4220f1e5ad7d9b4854eef143344a918620c4df76b20c58daf223ee8c3f8 WatchSource:0}: Error finding container f180c4220f1e5ad7d9b4854eef143344a918620c4df76b20c58daf223ee8c3f8: Status 404 returned error can't find the container with id f180c4220f1e5ad7d9b4854eef143344a918620c4df76b20c58daf223ee8c3f8 Dec 03 22:29:53 crc kubenswrapper[4830]: I1203 22:29:53.965703 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:29:54 crc kubenswrapper[4830]: I1203 22:29:54.064581 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4dd8f4ca-85be-49df-b95a-adb609cbbff2","Type":"ContainerStarted","Data":"27c9a3a447ac9f74a865b9a982eee4bd7edfa284de6bf1604f99fc9fd22531d4"} Dec 03 22:29:54 crc kubenswrapper[4830]: I1203 22:29:54.066787 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q88bf" event={"ID":"0351bdb5-464e-4a70-8c3f-ea5620cc0c31","Type":"ContainerStarted","Data":"485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5"} Dec 03 22:29:54 crc kubenswrapper[4830]: I1203 22:29:54.073300 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b44e63ea-f87b-48f3-8af7-bc3e35ce5265","Type":"ContainerStarted","Data":"75849f830494bc930dcf88ea2f34bef6e8a3551559eef209108274c01604cee5"} Dec 03 22:29:54 crc kubenswrapper[4830]: I1203 22:29:54.073340 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"b44e63ea-f87b-48f3-8af7-bc3e35ce5265","Type":"ContainerStarted","Data":"c0a19189d0f59199d934882d643da82492ba6860b78c1ea6d7407988c9669ed5"} Dec 03 22:29:54 crc kubenswrapper[4830]: I1203 22:29:54.073353 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b44e63ea-f87b-48f3-8af7-bc3e35ce5265","Type":"ContainerStarted","Data":"1146a7fb042c017b8077927c06a39a62754d0865f4232e5a0e6490b23594148c"} Dec 03 22:29:54 crc kubenswrapper[4830]: I1203 22:29:54.077871 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d","Type":"ContainerStarted","Data":"92a211a8ac5579bc4b8a9010ff7a39c89d2d68226607353c140218d5735fe78a"} Dec 03 22:29:54 crc kubenswrapper[4830]: I1203 22:29:54.077914 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d","Type":"ContainerStarted","Data":"538661ba331a038992a5fe7a4fb9a44ea2d9ce6fb6e780eb8fcdc5c972f3665c"} Dec 03 22:29:54 crc kubenswrapper[4830]: I1203 22:29:54.077926 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22dd5f84-7ae4-442e-b6a1-dd27b2d3875d","Type":"ContainerStarted","Data":"f180c4220f1e5ad7d9b4854eef143344a918620c4df76b20c58daf223ee8c3f8"} Dec 03 22:29:54 crc kubenswrapper[4830]: I1203 22:29:54.093589 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q88bf" podStartSLOduration=3.566049475 podStartE2EDuration="6.093570874s" podCreationTimestamp="2025-12-03 22:29:48 +0000 UTC" firstStartedPulling="2025-12-03 22:29:50.947280501 +0000 UTC m=+1479.943741880" lastFinishedPulling="2025-12-03 22:29:53.47480191 +0000 UTC m=+1482.471263279" observedRunningTime="2025-12-03 22:29:54.084874979 +0000 UTC m=+1483.081336328" watchObservedRunningTime="2025-12-03 22:29:54.093570874 +0000 UTC m=+1483.090032223" Dec 03 22:29:54 crc 
kubenswrapper[4830]: I1203 22:29:54.117893 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.117873302 podStartE2EDuration="2.117873302s" podCreationTimestamp="2025-12-03 22:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:29:54.101420737 +0000 UTC m=+1483.097882086" watchObservedRunningTime="2025-12-03 22:29:54.117873302 +0000 UTC m=+1483.114334651" Dec 03 22:29:54 crc kubenswrapper[4830]: I1203 22:29:54.129866 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.129829635 podStartE2EDuration="2.129829635s" podCreationTimestamp="2025-12-03 22:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:29:54.116490885 +0000 UTC m=+1483.112952234" watchObservedRunningTime="2025-12-03 22:29:54.129829635 +0000 UTC m=+1483.126290984" Dec 03 22:29:55 crc kubenswrapper[4830]: I1203 22:29:55.095600 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4dd8f4ca-85be-49df-b95a-adb609cbbff2","Type":"ContainerStarted","Data":"00b3b0d7b850f05428a00bd32e279b5882594ae53b9a9d1264433d15c83eee5d"} Dec 03 22:29:55 crc kubenswrapper[4830]: I1203 22:29:55.115647 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.115630206 podStartE2EDuration="2.115630206s" podCreationTimestamp="2025-12-03 22:29:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:29:55.110357103 +0000 UTC m=+1484.106818462" watchObservedRunningTime="2025-12-03 22:29:55.115630206 +0000 UTC m=+1484.112091545" Dec 03 22:29:56 crc kubenswrapper[4830]: I1203 
22:29:56.902237 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.219:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:29:56 crc kubenswrapper[4830]: I1203 22:29:56.902279 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5e1d3561-51e0-4713-a94f-59f8dcce1e29" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.219:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:29:57 crc kubenswrapper[4830]: I1203 22:29:57.885969 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 22:29:57 crc kubenswrapper[4830]: I1203 22:29:57.886346 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 22:29:58 crc kubenswrapper[4830]: I1203 22:29:58.462781 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 22:29:59 crc kubenswrapper[4830]: I1203 22:29:59.050039 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 22:29:59 crc kubenswrapper[4830]: I1203 22:29:59.056810 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:59 crc kubenswrapper[4830]: I1203 22:29:59.057116 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:59 crc kubenswrapper[4830]: I1203 22:29:59.129461 4830 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:59 crc kubenswrapper[4830]: I1203 22:29:59.210702 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:29:59 crc kubenswrapper[4830]: I1203 22:29:59.370463 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q88bf"] Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.170320 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4"] Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.172503 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.179572 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.179833 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.186698 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4"] Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.282712 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-config-volume\") pod \"collect-profiles-29413350-lfdq4\" (UID: \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.282856 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-secret-volume\") pod \"collect-profiles-29413350-lfdq4\" (UID: \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.283005 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lxhs\" (UniqueName: \"kubernetes.io/projected/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-kube-api-access-8lxhs\") pod \"collect-profiles-29413350-lfdq4\" (UID: \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.385230 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-secret-volume\") pod \"collect-profiles-29413350-lfdq4\" (UID: \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.385361 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lxhs\" (UniqueName: \"kubernetes.io/projected/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-kube-api-access-8lxhs\") pod \"collect-profiles-29413350-lfdq4\" (UID: \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.385424 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-config-volume\") pod \"collect-profiles-29413350-lfdq4\" (UID: 
\"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.386210 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-config-volume\") pod \"collect-profiles-29413350-lfdq4\" (UID: \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.399726 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-secret-volume\") pod \"collect-profiles-29413350-lfdq4\" (UID: \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.405019 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lxhs\" (UniqueName: \"kubernetes.io/projected/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-kube-api-access-8lxhs\") pod \"collect-profiles-29413350-lfdq4\" (UID: \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:00 crc kubenswrapper[4830]: I1203 22:30:00.506652 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:01 crc kubenswrapper[4830]: I1203 22:30:01.028208 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4"] Dec 03 22:30:01 crc kubenswrapper[4830]: I1203 22:30:01.175360 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" event={"ID":"f7b0d2fb-1190-46dc-ad5f-f6435077bf11","Type":"ContainerStarted","Data":"8e07401587881d7b2a0c3e7e0dae44902cb52276cd5c0ce028ab724b04a3bfb7"} Dec 03 22:30:01 crc kubenswrapper[4830]: I1203 22:30:01.175546 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q88bf" podUID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" containerName="registry-server" containerID="cri-o://485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5" gracePeriod=2 Dec 03 22:30:01 crc kubenswrapper[4830]: I1203 22:30:01.823725 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:30:01 crc kubenswrapper[4830]: I1203 22:30:01.919484 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-utilities\") pod \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " Dec 03 22:30:01 crc kubenswrapper[4830]: I1203 22:30:01.919648 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7dvs\" (UniqueName: \"kubernetes.io/projected/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-kube-api-access-w7dvs\") pod \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " Dec 03 22:30:01 crc kubenswrapper[4830]: I1203 22:30:01.919736 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-catalog-content\") pod \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\" (UID: \"0351bdb5-464e-4a70-8c3f-ea5620cc0c31\") " Dec 03 22:30:01 crc kubenswrapper[4830]: I1203 22:30:01.920923 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-utilities" (OuterVolumeSpecName: "utilities") pod "0351bdb5-464e-4a70-8c3f-ea5620cc0c31" (UID: "0351bdb5-464e-4a70-8c3f-ea5620cc0c31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:01 crc kubenswrapper[4830]: I1203 22:30:01.926527 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-kube-api-access-w7dvs" (OuterVolumeSpecName: "kube-api-access-w7dvs") pod "0351bdb5-464e-4a70-8c3f-ea5620cc0c31" (UID: "0351bdb5-464e-4a70-8c3f-ea5620cc0c31"). InnerVolumeSpecName "kube-api-access-w7dvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:30:01 crc kubenswrapper[4830]: I1203 22:30:01.971563 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0351bdb5-464e-4a70-8c3f-ea5620cc0c31" (UID: "0351bdb5-464e-4a70-8c3f-ea5620cc0c31"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.021831 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.021880 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7dvs\" (UniqueName: \"kubernetes.io/projected/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-kube-api-access-w7dvs\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.021891 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0351bdb5-464e-4a70-8c3f-ea5620cc0c31-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.197753 4830 generic.go:334] "Generic (PLEG): container finished" podID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" containerID="485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5" exitCode=0 Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.197834 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q88bf" event={"ID":"0351bdb5-464e-4a70-8c3f-ea5620cc0c31","Type":"ContainerDied","Data":"485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5"} Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.197868 4830 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-q88bf" event={"ID":"0351bdb5-464e-4a70-8c3f-ea5620cc0c31","Type":"ContainerDied","Data":"cdf1e31890faabbae89c144f2d2c51c691a573e72c4a5994b67e93b19174299b"} Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.197890 4830 scope.go:117] "RemoveContainer" containerID="485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.198057 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q88bf" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.204032 4830 generic.go:334] "Generic (PLEG): container finished" podID="f7b0d2fb-1190-46dc-ad5f-f6435077bf11" containerID="759271d4429d557e2e0f563bf3248ff047c19d3c74cec15f53039fef9088820e" exitCode=0 Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.204094 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" event={"ID":"f7b0d2fb-1190-46dc-ad5f-f6435077bf11","Type":"ContainerDied","Data":"759271d4429d557e2e0f563bf3248ff047c19d3c74cec15f53039fef9088820e"} Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.232378 4830 scope.go:117] "RemoveContainer" containerID="07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.256820 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q88bf"] Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.265917 4830 scope.go:117] "RemoveContainer" containerID="b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.269359 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q88bf"] Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.302849 4830 scope.go:117] "RemoveContainer" 
containerID="485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5" Dec 03 22:30:02 crc kubenswrapper[4830]: E1203 22:30:02.303239 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5\": container with ID starting with 485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5 not found: ID does not exist" containerID="485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.303276 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5"} err="failed to get container status \"485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5\": rpc error: code = NotFound desc = could not find container \"485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5\": container with ID starting with 485c54f08c1d3552511f846480ba57c310db152dc113ea2a08c5e5b9bc0398e5 not found: ID does not exist" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.303297 4830 scope.go:117] "RemoveContainer" containerID="07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014" Dec 03 22:30:02 crc kubenswrapper[4830]: E1203 22:30:02.303561 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014\": container with ID starting with 07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014 not found: ID does not exist" containerID="07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.303589 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014"} err="failed to get container status \"07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014\": rpc error: code = NotFound desc = could not find container \"07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014\": container with ID starting with 07e8edb053bb24c0a6778816087a111c48c6a86c7f0b3d0863cce5621c771014 not found: ID does not exist" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.303606 4830 scope.go:117] "RemoveContainer" containerID="b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50" Dec 03 22:30:02 crc kubenswrapper[4830]: E1203 22:30:02.303826 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50\": container with ID starting with b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50 not found: ID does not exist" containerID="b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.303858 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50"} err="failed to get container status \"b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50\": rpc error: code = NotFound desc = could not find container \"b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50\": container with ID starting with b7c44b28dbb9072c6599ae87df5be43e4e284092307a82d11b24447321a58d50 not found: ID does not exist" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.886149 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 22:30:02 crc kubenswrapper[4830]: I1203 22:30:02.886600 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.018427 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.018543 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.347879 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" path="/var/lib/kubelet/pods/0351bdb5-464e-4a70-8c3f-ea5620cc0c31/volumes" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.463040 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.517146 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.711182 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.865090 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-secret-volume\") pod \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\" (UID: \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.865143 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-config-volume\") pod \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\" (UID: \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.865268 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lxhs\" (UniqueName: \"kubernetes.io/projected/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-kube-api-access-8lxhs\") pod \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\" (UID: \"f7b0d2fb-1190-46dc-ad5f-f6435077bf11\") " Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.865954 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-config-volume" (OuterVolumeSpecName: "config-volume") pod "f7b0d2fb-1190-46dc-ad5f-f6435077bf11" (UID: "f7b0d2fb-1190-46dc-ad5f-f6435077bf11"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.882909 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f7b0d2fb-1190-46dc-ad5f-f6435077bf11" (UID: "f7b0d2fb-1190-46dc-ad5f-f6435077bf11"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.894710 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-kube-api-access-8lxhs" (OuterVolumeSpecName: "kube-api-access-8lxhs") pod "f7b0d2fb-1190-46dc-ad5f-f6435077bf11" (UID: "f7b0d2fb-1190-46dc-ad5f-f6435077bf11"). InnerVolumeSpecName "kube-api-access-8lxhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.900719 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b44e63ea-f87b-48f3-8af7-bc3e35ce5265" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.900719 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b44e63ea-f87b-48f3-8af7-bc3e35ce5265" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.967151 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lxhs\" (UniqueName: \"kubernetes.io/projected/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-kube-api-access-8lxhs\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.967182 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:03 crc kubenswrapper[4830]: I1203 22:30:03.967193 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f7b0d2fb-1190-46dc-ad5f-f6435077bf11-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:04 crc kubenswrapper[4830]: I1203 22:30:04.033745 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22dd5f84-7ae4-442e-b6a1-dd27b2d3875d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:30:04 crc kubenswrapper[4830]: I1203 22:30:04.033759 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22dd5f84-7ae4-442e-b6a1-dd27b2d3875d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:30:04 crc kubenswrapper[4830]: I1203 22:30:04.227976 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" event={"ID":"f7b0d2fb-1190-46dc-ad5f-f6435077bf11","Type":"ContainerDied","Data":"8e07401587881d7b2a0c3e7e0dae44902cb52276cd5c0ce028ab724b04a3bfb7"} Dec 03 22:30:04 crc kubenswrapper[4830]: I1203 22:30:04.228011 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e07401587881d7b2a0c3e7e0dae44902cb52276cd5c0ce028ab724b04a3bfb7" Dec 03 22:30:04 crc kubenswrapper[4830]: I1203 22:30:04.228010 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4" Dec 03 22:30:04 crc kubenswrapper[4830]: I1203 22:30:04.263909 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.253175 4830 generic.go:334] "Generic (PLEG): container finished" podID="029f840e-def3-45b4-a109-6769ec5f64db" containerID="9ed2b8a980b371638c2496213d6a5f2b6faade6e2cd927aa1e9578cd86308511" exitCode=137 Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.253315 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"029f840e-def3-45b4-a109-6769ec5f64db","Type":"ContainerDied","Data":"9ed2b8a980b371638c2496213d6a5f2b6faade6e2cd927aa1e9578cd86308511"} Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.253892 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"029f840e-def3-45b4-a109-6769ec5f64db","Type":"ContainerDied","Data":"7bf24cf436c7c8bb92e12d9e080732dcc56e2d0dfa1043027f610f1c19384e2a"} Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.253919 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bf24cf436c7c8bb92e12d9e080732dcc56e2d0dfa1043027f610f1c19384e2a" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.299840 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.413786 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-combined-ca-bundle\") pod \"029f840e-def3-45b4-a109-6769ec5f64db\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.413844 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-run-httpd\") pod \"029f840e-def3-45b4-a109-6769ec5f64db\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.414039 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdvmh\" (UniqueName: \"kubernetes.io/projected/029f840e-def3-45b4-a109-6769ec5f64db-kube-api-access-kdvmh\") pod \"029f840e-def3-45b4-a109-6769ec5f64db\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.414067 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-log-httpd\") pod \"029f840e-def3-45b4-a109-6769ec5f64db\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.414131 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-scripts\") pod \"029f840e-def3-45b4-a109-6769ec5f64db\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.414195 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-ceilometer-tls-certs\") pod \"029f840e-def3-45b4-a109-6769ec5f64db\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.414223 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-config-data\") pod \"029f840e-def3-45b4-a109-6769ec5f64db\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.414248 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-sg-core-conf-yaml\") pod \"029f840e-def3-45b4-a109-6769ec5f64db\" (UID: \"029f840e-def3-45b4-a109-6769ec5f64db\") " Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.414959 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "029f840e-def3-45b4-a109-6769ec5f64db" (UID: "029f840e-def3-45b4-a109-6769ec5f64db"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.415593 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "029f840e-def3-45b4-a109-6769ec5f64db" (UID: "029f840e-def3-45b4-a109-6769ec5f64db"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.419813 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-scripts" (OuterVolumeSpecName: "scripts") pod "029f840e-def3-45b4-a109-6769ec5f64db" (UID: "029f840e-def3-45b4-a109-6769ec5f64db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.432245 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/029f840e-def3-45b4-a109-6769ec5f64db-kube-api-access-kdvmh" (OuterVolumeSpecName: "kube-api-access-kdvmh") pod "029f840e-def3-45b4-a109-6769ec5f64db" (UID: "029f840e-def3-45b4-a109-6769ec5f64db"). InnerVolumeSpecName "kube-api-access-kdvmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.453283 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "029f840e-def3-45b4-a109-6769ec5f64db" (UID: "029f840e-def3-45b4-a109-6769ec5f64db"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.487062 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "029f840e-def3-45b4-a109-6769ec5f64db" (UID: "029f840e-def3-45b4-a109-6769ec5f64db"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.516928 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdvmh\" (UniqueName: \"kubernetes.io/projected/029f840e-def3-45b4-a109-6769ec5f64db-kube-api-access-kdvmh\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.516963 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.516976 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.516987 4830 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.516997 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.517007 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/029f840e-def3-45b4-a109-6769ec5f64db-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.529628 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "029f840e-def3-45b4-a109-6769ec5f64db" (UID: 
"029f840e-def3-45b4-a109-6769ec5f64db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.560732 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-config-data" (OuterVolumeSpecName: "config-data") pod "029f840e-def3-45b4-a109-6769ec5f64db" (UID: "029f840e-def3-45b4-a109-6769ec5f64db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.618379 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:06 crc kubenswrapper[4830]: I1203 22:30:06.618716 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/029f840e-def3-45b4-a109-6769ec5f64db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.263414 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.305885 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.319043 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.363310 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="029f840e-def3-45b4-a109-6769ec5f64db" path="/var/lib/kubelet/pods/029f840e-def3-45b4-a109-6769ec5f64db/volumes" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.364291 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:07 crc kubenswrapper[4830]: E1203 22:30:07.367342 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="proxy-httpd" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.367367 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="proxy-httpd" Dec 03 22:30:07 crc kubenswrapper[4830]: E1203 22:30:07.367402 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b0d2fb-1190-46dc-ad5f-f6435077bf11" containerName="collect-profiles" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.367409 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b0d2fb-1190-46dc-ad5f-f6435077bf11" containerName="collect-profiles" Dec 03 22:30:07 crc kubenswrapper[4830]: E1203 22:30:07.367423 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="sg-core" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.367428 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="sg-core" Dec 03 22:30:07 crc kubenswrapper[4830]: E1203 22:30:07.367444 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" containerName="registry-server" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.367450 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" containerName="registry-server" Dec 03 22:30:07 crc kubenswrapper[4830]: E1203 22:30:07.367736 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="ceilometer-notification-agent" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.367744 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="ceilometer-notification-agent" Dec 03 22:30:07 crc kubenswrapper[4830]: E1203 22:30:07.367753 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="ceilometer-central-agent" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.367759 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="ceilometer-central-agent" Dec 03 22:30:07 crc kubenswrapper[4830]: E1203 22:30:07.367771 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" containerName="extract-content" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.367778 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" containerName="extract-content" Dec 03 22:30:07 crc kubenswrapper[4830]: E1203 22:30:07.367791 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" containerName="extract-utilities" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.367797 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" containerName="extract-utilities" Dec 03 22:30:07 crc 
kubenswrapper[4830]: I1203 22:30:07.368125 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0351bdb5-464e-4a70-8c3f-ea5620cc0c31" containerName="registry-server" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.368148 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="proxy-httpd" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.368164 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="sg-core" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.368177 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b0d2fb-1190-46dc-ad5f-f6435077bf11" containerName="collect-profiles" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.368188 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="ceilometer-notification-agent" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.368199 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="029f840e-def3-45b4-a109-6769ec5f64db" containerName="ceilometer-central-agent" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.370862 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.374150 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.374297 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.374565 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.379748 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.535073 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.535149 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-config-data\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.535195 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-scripts\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.535287 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-run-httpd\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.535522 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.535592 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.535651 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxcwq\" (UniqueName: \"kubernetes.io/projected/15f16bf5-4efa-4438-b6b3-3a16377239a6-kube-api-access-cxcwq\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.535672 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-log-httpd\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.638403 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.638477 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-config-data\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.638554 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-scripts\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.638623 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-run-httpd\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.638722 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.638912 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.639124 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-run-httpd\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.640023 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-log-httpd\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.640066 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxcwq\" (UniqueName: \"kubernetes.io/projected/15f16bf5-4efa-4438-b6b3-3a16377239a6-kube-api-access-cxcwq\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.642177 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-log-httpd\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.647494 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.651194 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-scripts\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.653328 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.657294 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.660288 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-config-data\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.675226 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxcwq\" (UniqueName: \"kubernetes.io/projected/15f16bf5-4efa-4438-b6b3-3a16377239a6-kube-api-access-cxcwq\") pod \"ceilometer-0\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " pod="openstack/ceilometer-0" Dec 03 22:30:07 crc kubenswrapper[4830]: I1203 22:30:07.687176 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:08 crc kubenswrapper[4830]: I1203 22:30:08.213450 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:08 crc kubenswrapper[4830]: W1203 22:30:08.214672 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15f16bf5_4efa_4438_b6b3_3a16377239a6.slice/crio-788fae9f6f76c3d2c9ce579464fd35c4a79ac4a2bd7e73b56ebdb2b97f5bf4f2 WatchSource:0}: Error finding container 788fae9f6f76c3d2c9ce579464fd35c4a79ac4a2bd7e73b56ebdb2b97f5bf4f2: Status 404 returned error can't find the container with id 788fae9f6f76c3d2c9ce579464fd35c4a79ac4a2bd7e73b56ebdb2b97f5bf4f2 Dec 03 22:30:08 crc kubenswrapper[4830]: I1203 22:30:08.278033 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15f16bf5-4efa-4438-b6b3-3a16377239a6","Type":"ContainerStarted","Data":"788fae9f6f76c3d2c9ce579464fd35c4a79ac4a2bd7e73b56ebdb2b97f5bf4f2"} Dec 03 22:30:09 crc kubenswrapper[4830]: I1203 22:30:09.289317 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15f16bf5-4efa-4438-b6b3-3a16377239a6","Type":"ContainerStarted","Data":"7fa7123e51beab7181b4395509dd264c268c667ea8e584a8ae337d6179a5c08f"} Dec 03 22:30:10 crc kubenswrapper[4830]: I1203 22:30:10.301006 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15f16bf5-4efa-4438-b6b3-3a16377239a6","Type":"ContainerStarted","Data":"262abade23144cfe38b6e023e70f8beaaec8be9dcab81314c8f6eb5d2c9ab55c"} Dec 03 22:30:11 crc kubenswrapper[4830]: I1203 22:30:11.311355 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15f16bf5-4efa-4438-b6b3-3a16377239a6","Type":"ContainerStarted","Data":"a5759f70e1c431d817c4bbc4b6e28b9d81558661a7a2d4ae6bfc4a6a4f087a5d"} Dec 03 22:30:12 crc kubenswrapper[4830]: I1203 
22:30:12.324176 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15f16bf5-4efa-4438-b6b3-3a16377239a6","Type":"ContainerStarted","Data":"9bf93171e4c09d1ee7483eac71a63f0bfb13c074ed85b9e6d297c2bc86357bdb"} Dec 03 22:30:12 crc kubenswrapper[4830]: I1203 22:30:12.325354 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:30:12 crc kubenswrapper[4830]: I1203 22:30:12.358479 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.956797305 podStartE2EDuration="5.358455615s" podCreationTimestamp="2025-12-03 22:30:07 +0000 UTC" firstStartedPulling="2025-12-03 22:30:08.218261102 +0000 UTC m=+1497.214722471" lastFinishedPulling="2025-12-03 22:30:11.619919422 +0000 UTC m=+1500.616380781" observedRunningTime="2025-12-03 22:30:12.344792275 +0000 UTC m=+1501.341253644" watchObservedRunningTime="2025-12-03 22:30:12.358455615 +0000 UTC m=+1501.354916974" Dec 03 22:30:12 crc kubenswrapper[4830]: I1203 22:30:12.893066 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 22:30:12 crc kubenswrapper[4830]: I1203 22:30:12.895257 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 22:30:12 crc kubenswrapper[4830]: I1203 22:30:12.898646 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 22:30:13 crc kubenswrapper[4830]: I1203 22:30:13.029297 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 22:30:13 crc kubenswrapper[4830]: I1203 22:30:13.030342 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 22:30:13 crc kubenswrapper[4830]: I1203 22:30:13.034970 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Dec 03 22:30:13 crc kubenswrapper[4830]: I1203 22:30:13.039398 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 22:30:13 crc kubenswrapper[4830]: I1203 22:30:13.335068 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 22:30:13 crc kubenswrapper[4830]: I1203 22:30:13.348121 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 22:30:13 crc kubenswrapper[4830]: I1203 22:30:13.353612 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 22:30:37 crc kubenswrapper[4830]: I1203 22:30:37.698655 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.169460 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-67qfl"] Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.184595 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-67qfl"] Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.291080 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-4v4rm"] Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.292310 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.294919 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.325052 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-4v4rm"] Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.361521 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4f0abb0-964c-42d5-8a2d-2cdf84d049c7" path="/var/lib/kubelet/pods/d4f0abb0-964c-42d5-8a2d-2cdf84d049c7/volumes" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.388746 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-scripts\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.388800 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-config-data\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.388861 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-combined-ca-bundle\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.388894 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-certs\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.388919 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g97nt\" (UniqueName: \"kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-kube-api-access-g97nt\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.490515 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-combined-ca-bundle\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.490572 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-certs\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.490602 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g97nt\" (UniqueName: \"kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-kube-api-access-g97nt\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.490728 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-scripts\") pod 
\"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.490758 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-config-data\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.497434 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-combined-ca-bundle\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.498902 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-config-data\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.502072 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-scripts\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.511047 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-certs\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 
22:30:49.513077 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g97nt\" (UniqueName: \"kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-kube-api-access-g97nt\") pod \"cloudkitty-db-sync-4v4rm\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:49 crc kubenswrapper[4830]: I1203 22:30:49.608220 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:30:50 crc kubenswrapper[4830]: I1203 22:30:50.123203 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-4v4rm"] Dec 03 22:30:50 crc kubenswrapper[4830]: I1203 22:30:50.823499 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-4v4rm" event={"ID":"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6","Type":"ContainerStarted","Data":"448af7412e86918b5e76553acca326e6a0ef587b74745778dde2777faa440ca3"} Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.215098 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.215375 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="ceilometer-central-agent" containerID="cri-o://7fa7123e51beab7181b4395509dd264c268c667ea8e584a8ae337d6179a5c08f" gracePeriod=30 Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.215561 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="sg-core" containerID="cri-o://a5759f70e1c431d817c4bbc4b6e28b9d81558661a7a2d4ae6bfc4a6a4f087a5d" gracePeriod=30 Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.215665 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="proxy-httpd" containerID="cri-o://9bf93171e4c09d1ee7483eac71a63f0bfb13c074ed85b9e6d297c2bc86357bdb" gracePeriod=30 Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.215498 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="ceilometer-notification-agent" containerID="cri-o://262abade23144cfe38b6e023e70f8beaaec8be9dcab81314c8f6eb5d2c9ab55c" gracePeriod=30 Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.548872 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.857755 4830 generic.go:334] "Generic (PLEG): container finished" podID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerID="9bf93171e4c09d1ee7483eac71a63f0bfb13c074ed85b9e6d297c2bc86357bdb" exitCode=0 Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.857794 4830 generic.go:334] "Generic (PLEG): container finished" podID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerID="a5759f70e1c431d817c4bbc4b6e28b9d81558661a7a2d4ae6bfc4a6a4f087a5d" exitCode=2 Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.857804 4830 generic.go:334] "Generic (PLEG): container finished" podID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerID="7fa7123e51beab7181b4395509dd264c268c667ea8e584a8ae337d6179a5c08f" exitCode=0 Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.857827 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15f16bf5-4efa-4438-b6b3-3a16377239a6","Type":"ContainerDied","Data":"9bf93171e4c09d1ee7483eac71a63f0bfb13c074ed85b9e6d297c2bc86357bdb"} Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.857858 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"15f16bf5-4efa-4438-b6b3-3a16377239a6","Type":"ContainerDied","Data":"a5759f70e1c431d817c4bbc4b6e28b9d81558661a7a2d4ae6bfc4a6a4f087a5d"} Dec 03 22:30:51 crc kubenswrapper[4830]: I1203 22:30:51.857871 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15f16bf5-4efa-4438-b6b3-3a16377239a6","Type":"ContainerDied","Data":"7fa7123e51beab7181b4395509dd264c268c667ea8e584a8ae337d6179a5c08f"} Dec 03 22:30:52 crc kubenswrapper[4830]: I1203 22:30:52.569662 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 22:30:52 crc kubenswrapper[4830]: I1203 22:30:52.896775 4830 generic.go:334] "Generic (PLEG): container finished" podID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerID="262abade23144cfe38b6e023e70f8beaaec8be9dcab81314c8f6eb5d2c9ab55c" exitCode=0 Dec 03 22:30:52 crc kubenswrapper[4830]: I1203 22:30:52.896823 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15f16bf5-4efa-4438-b6b3-3a16377239a6","Type":"ContainerDied","Data":"262abade23144cfe38b6e023e70f8beaaec8be9dcab81314c8f6eb5d2c9ab55c"} Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.127611 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.270959 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-combined-ca-bundle\") pod \"15f16bf5-4efa-4438-b6b3-3a16377239a6\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.271029 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxcwq\" (UniqueName: \"kubernetes.io/projected/15f16bf5-4efa-4438-b6b3-3a16377239a6-kube-api-access-cxcwq\") pod \"15f16bf5-4efa-4438-b6b3-3a16377239a6\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.272706 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-run-httpd\") pod \"15f16bf5-4efa-4438-b6b3-3a16377239a6\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.272735 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-log-httpd\") pod \"15f16bf5-4efa-4438-b6b3-3a16377239a6\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.272764 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-scripts\") pod \"15f16bf5-4efa-4438-b6b3-3a16377239a6\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.272920 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-ceilometer-tls-certs\") pod \"15f16bf5-4efa-4438-b6b3-3a16377239a6\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.272981 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-config-data\") pod \"15f16bf5-4efa-4438-b6b3-3a16377239a6\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.273019 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-sg-core-conf-yaml\") pod \"15f16bf5-4efa-4438-b6b3-3a16377239a6\" (UID: \"15f16bf5-4efa-4438-b6b3-3a16377239a6\") " Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.273084 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "15f16bf5-4efa-4438-b6b3-3a16377239a6" (UID: "15f16bf5-4efa-4438-b6b3-3a16377239a6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.273864 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.277254 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "15f16bf5-4efa-4438-b6b3-3a16377239a6" (UID: "15f16bf5-4efa-4438-b6b3-3a16377239a6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.280203 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-scripts" (OuterVolumeSpecName: "scripts") pod "15f16bf5-4efa-4438-b6b3-3a16377239a6" (UID: "15f16bf5-4efa-4438-b6b3-3a16377239a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.280313 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f16bf5-4efa-4438-b6b3-3a16377239a6-kube-api-access-cxcwq" (OuterVolumeSpecName: "kube-api-access-cxcwq") pod "15f16bf5-4efa-4438-b6b3-3a16377239a6" (UID: "15f16bf5-4efa-4438-b6b3-3a16377239a6"). InnerVolumeSpecName "kube-api-access-cxcwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.323409 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "15f16bf5-4efa-4438-b6b3-3a16377239a6" (UID: "15f16bf5-4efa-4438-b6b3-3a16377239a6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.334042 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "15f16bf5-4efa-4438-b6b3-3a16377239a6" (UID: "15f16bf5-4efa-4438-b6b3-3a16377239a6"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.378643 4830 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.378792 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.378847 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxcwq\" (UniqueName: \"kubernetes.io/projected/15f16bf5-4efa-4438-b6b3-3a16377239a6-kube-api-access-cxcwq\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.378907 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15f16bf5-4efa-4438-b6b3-3a16377239a6-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.378960 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.387596 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-config-data" (OuterVolumeSpecName: "config-data") pod "15f16bf5-4efa-4438-b6b3-3a16377239a6" (UID: "15f16bf5-4efa-4438-b6b3-3a16377239a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.390055 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15f16bf5-4efa-4438-b6b3-3a16377239a6" (UID: "15f16bf5-4efa-4438-b6b3-3a16377239a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.480851 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.480880 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f16bf5-4efa-4438-b6b3-3a16377239a6-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.920329 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15f16bf5-4efa-4438-b6b3-3a16377239a6","Type":"ContainerDied","Data":"788fae9f6f76c3d2c9ce579464fd35c4a79ac4a2bd7e73b56ebdb2b97f5bf4f2"} Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.920771 4830 scope.go:117] "RemoveContainer" containerID="9bf93171e4c09d1ee7483eac71a63f0bfb13c074ed85b9e6d297c2bc86357bdb" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.920388 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.956708 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.968164 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:53 crc kubenswrapper[4830]: I1203 22:30:53.990597 4830 scope.go:117] "RemoveContainer" containerID="a5759f70e1c431d817c4bbc4b6e28b9d81558661a7a2d4ae6bfc4a6a4f087a5d" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.023187 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:54 crc kubenswrapper[4830]: E1203 22:30:54.024135 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="ceilometer-notification-agent" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.024156 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="ceilometer-notification-agent" Dec 03 22:30:54 crc kubenswrapper[4830]: E1203 22:30:54.036603 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="sg-core" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.036625 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="sg-core" Dec 03 22:30:54 crc kubenswrapper[4830]: E1203 22:30:54.036658 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="proxy-httpd" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.036668 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="proxy-httpd" Dec 03 22:30:54 crc kubenswrapper[4830]: E1203 22:30:54.036704 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="ceilometer-central-agent" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.036712 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="ceilometer-central-agent" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.037814 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="ceilometer-notification-agent" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.037872 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="sg-core" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.037926 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="proxy-httpd" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.037940 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" containerName="ceilometer-central-agent" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.041959 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.055963 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.058221 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.058346 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.058429 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.074813 4830 scope.go:117] "RemoveContainer" containerID="262abade23144cfe38b6e023e70f8beaaec8be9dcab81314c8f6eb5d2c9ab55c" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.132806 4830 scope.go:117] "RemoveContainer" containerID="7fa7123e51beab7181b4395509dd264c268c667ea8e584a8ae337d6179a5c08f" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.198074 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-log-httpd\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.198108 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-config-data\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.198152 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.198260 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-scripts\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.198529 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.198704 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-run-httpd\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.198723 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bh7x\" (UniqueName: \"kubernetes.io/projected/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-kube-api-access-9bh7x\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.198744 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.300778 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-run-httpd\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.300827 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bh7x\" (UniqueName: \"kubernetes.io/projected/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-kube-api-access-9bh7x\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.300863 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.300935 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-log-httpd\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.300953 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-config-data\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.300994 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.301017 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-scripts\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.301103 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.301750 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-log-httpd\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.302379 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-run-httpd\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.308525 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 
22:30:54.310388 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-scripts\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.310533 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.320674 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-config-data\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.327425 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bh7x\" (UniqueName: \"kubernetes.io/projected/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-kube-api-access-9bh7x\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.335161 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf\") " pod="openstack/ceilometer-0" Dec 03 22:30:54 crc kubenswrapper[4830]: I1203 22:30:54.371210 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:55 crc kubenswrapper[4830]: I1203 22:30:55.138493 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:55 crc kubenswrapper[4830]: I1203 22:30:55.353186 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f16bf5-4efa-4438-b6b3-3a16377239a6" path="/var/lib/kubelet/pods/15f16bf5-4efa-4438-b6b3-3a16377239a6/volumes" Dec 03 22:30:55 crc kubenswrapper[4830]: I1203 22:30:55.962286 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf","Type":"ContainerStarted","Data":"0dcfb5610ed4a8252383cfa14df68e064fed2e3b96c54cff27538ef71f372683"} Dec 03 22:30:56 crc kubenswrapper[4830]: I1203 22:30:56.717712 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" containerName="rabbitmq" containerID="cri-o://f1c3050649d45ba05f8c7e16d94979bee4c4861b21f455826e9557e3bcc7ac7e" gracePeriod=604795 Dec 03 22:30:57 crc kubenswrapper[4830]: I1203 22:30:57.293884 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" containerName="rabbitmq" containerID="cri-o://accd5ab84c124d6c490627992a78ee18f80188c66c763bb70f6a9af31dbbe222" gracePeriod=604796 Dec 03 22:30:59 crc kubenswrapper[4830]: I1203 22:30:59.808204 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mh6hf"] Dec 03 22:30:59 crc kubenswrapper[4830]: I1203 22:30:59.812342 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:30:59 crc kubenswrapper[4830]: I1203 22:30:59.825692 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mh6hf"] Dec 03 22:30:59 crc kubenswrapper[4830]: I1203 22:30:59.925053 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-utilities\") pod \"certified-operators-mh6hf\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:30:59 crc kubenswrapper[4830]: I1203 22:30:59.925107 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7st47\" (UniqueName: \"kubernetes.io/projected/034a9c41-0e84-4351-9483-453227579bda-kube-api-access-7st47\") pod \"certified-operators-mh6hf\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:30:59 crc kubenswrapper[4830]: I1203 22:30:59.925134 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-catalog-content\") pod \"certified-operators-mh6hf\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:00 crc kubenswrapper[4830]: I1203 22:31:00.027258 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-utilities\") pod \"certified-operators-mh6hf\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:00 crc kubenswrapper[4830]: I1203 22:31:00.027318 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7st47\" (UniqueName: \"kubernetes.io/projected/034a9c41-0e84-4351-9483-453227579bda-kube-api-access-7st47\") pod \"certified-operators-mh6hf\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:00 crc kubenswrapper[4830]: I1203 22:31:00.027349 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-catalog-content\") pod \"certified-operators-mh6hf\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:00 crc kubenswrapper[4830]: I1203 22:31:00.027842 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-utilities\") pod \"certified-operators-mh6hf\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:00 crc kubenswrapper[4830]: I1203 22:31:00.027876 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-catalog-content\") pod \"certified-operators-mh6hf\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:00 crc kubenswrapper[4830]: I1203 22:31:00.046489 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7st47\" (UniqueName: \"kubernetes.io/projected/034a9c41-0e84-4351-9483-453227579bda-kube-api-access-7st47\") pod \"certified-operators-mh6hf\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:00 crc kubenswrapper[4830]: I1203 22:31:00.153071 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:00 crc kubenswrapper[4830]: I1203 22:31:00.309926 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Dec 03 22:31:01 crc kubenswrapper[4830]: I1203 22:31:01.658441 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.093365 4830 generic.go:334] "Generic (PLEG): container finished" podID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" containerID="f1c3050649d45ba05f8c7e16d94979bee4c4861b21f455826e9557e3bcc7ac7e" exitCode=0 Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.093549 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1","Type":"ContainerDied","Data":"f1c3050649d45ba05f8c7e16d94979bee4c4861b21f455826e9557e3bcc7ac7e"} Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.878139 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-8zh8s"] Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.882181 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.888118 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.897229 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-8zh8s"] Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.941602 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.941668 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.941777 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-config\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.941810 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " 
pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.941895 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgrc\" (UniqueName: \"kubernetes.io/projected/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-kube-api-access-lmgrc\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.941925 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:04 crc kubenswrapper[4830]: I1203 22:31:04.941981 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.043860 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-config\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.043927 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " 
pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.044022 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgrc\" (UniqueName: \"kubernetes.io/projected/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-kube-api-access-lmgrc\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.044053 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.044110 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.044139 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.044162 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" 
Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.045260 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.045953 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.046628 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-config\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.047292 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.047574 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.047926 4830 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.064542 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgrc\" (UniqueName: \"kubernetes.io/projected/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-kube-api-access-lmgrc\") pod \"dnsmasq-dns-dbb88bf8c-8zh8s\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.113134 4830 generic.go:334] "Generic (PLEG): container finished" podID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" containerID="accd5ab84c124d6c490627992a78ee18f80188c66c763bb70f6a9af31dbbe222" exitCode=0 Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.113177 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fc13f96-b9cf-4e92-bbe6-2c3719041e59","Type":"ContainerDied","Data":"accd5ab84c124d6c490627992a78ee18f80188c66c763bb70f6a9af31dbbe222"} Dec 03 22:31:05 crc kubenswrapper[4830]: I1203 22:31:05.203361 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.758537 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.829449 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-plugins\") pod \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.829489 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-server-conf\") pod \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.829668 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-erlang-cookie-secret\") pod \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.829707 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-plugins-conf\") pod \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.829771 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnftw\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-kube-api-access-tnftw\") pod \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.830098 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\") pod \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.830275 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-erlang-cookie\") pod \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.830326 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-tls\") pod \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.830345 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-config-data\") pod \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.830520 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-confd\") pod \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.830552 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-pod-info\") pod \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\" (UID: \"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1\") " Dec 03 
22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.830643 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" (UID: "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.831326 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" (UID: "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.832068 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.832088 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.833598 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" (UID: "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.836352 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-kube-api-access-tnftw" (OuterVolumeSpecName: "kube-api-access-tnftw") pod "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" (UID: "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1"). InnerVolumeSpecName "kube-api-access-tnftw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.858860 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" (UID: "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.862029 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" (UID: "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.872253 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-pod-info" (OuterVolumeSpecName: "pod-info") pod "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" (UID: "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.903542 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a" (OuterVolumeSpecName: "persistence") pod "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" (UID: "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1"). InnerVolumeSpecName "pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.928291 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-config-data" (OuterVolumeSpecName: "config-data") pod "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" (UID: "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.934641 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.934681 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.934690 4830 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.934699 4830 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-erlang-cookie-secret\") on 
node \"crc\" DevicePath \"\"" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.934708 4830 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.934719 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnftw\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-kube-api-access-tnftw\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.934755 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\") on node \"crc\" " Dec 03 22:31:08 crc kubenswrapper[4830]: I1203 22:31:08.964976 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-server-conf" (OuterVolumeSpecName: "server-conf") pod "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" (UID: "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.047606 4830 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.069932 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" (UID: "1d294aa0-bf67-4fc4-ad99-fda0ddd054d1"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.149201 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.167147 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d294aa0-bf67-4fc4-ad99-fda0ddd054d1","Type":"ContainerDied","Data":"ba29a15596d43c544ee5fb22add28ad2e5393a7cf2b852dab02453018ff54652"} Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.167486 4830 scope.go:117] "RemoveContainer" containerID="f1c3050649d45ba05f8c7e16d94979bee4c4861b21f455826e9557e3bcc7ac7e" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.167633 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.250304 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.268942 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.305604 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 22:31:09 crc kubenswrapper[4830]: E1203 22:31:09.306086 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" containerName="setup-container" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.306099 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" containerName="setup-container" Dec 03 22:31:09 crc kubenswrapper[4830]: E1203 22:31:09.306113 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" containerName="rabbitmq" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.306119 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" containerName="rabbitmq" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.307917 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" containerName="rabbitmq" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.338557 4830 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.338746 4830 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a") on node "crc" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.340692 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.343667 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.356151 4830 reconciler_common.go:293] "Volume detached for volume \"pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.361639 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.361743 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.362713 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.362907 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.363055 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-d8d68" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.363172 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.363284 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.387944 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d294aa0-bf67-4fc4-ad99-fda0ddd054d1" path="/var/lib/kubelet/pods/1d294aa0-bf67-4fc4-ad99-fda0ddd054d1/volumes" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.457666 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/5a2c4f61-6b61-4907-8601-6eea8065d2f6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.457712 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.457789 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.457807 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.457824 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5a2c4f61-6b61-4907-8601-6eea8065d2f6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.457841 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/5a2c4f61-6b61-4907-8601-6eea8065d2f6-config-data\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.457861 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk7gq\" (UniqueName: \"kubernetes.io/projected/5a2c4f61-6b61-4907-8601-6eea8065d2f6-kube-api-access-mk7gq\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.457886 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5a2c4f61-6b61-4907-8601-6eea8065d2f6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.457910 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5a2c4f61-6b61-4907-8601-6eea8065d2f6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.457970 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.458010 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.560341 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.560395 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.560426 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5a2c4f61-6b61-4907-8601-6eea8065d2f6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.560456 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a2c4f61-6b61-4907-8601-6eea8065d2f6-config-data\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.560492 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk7gq\" (UniqueName: \"kubernetes.io/projected/5a2c4f61-6b61-4907-8601-6eea8065d2f6-kube-api-access-mk7gq\") pod \"rabbitmq-server-0\" (UID: 
\"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.560548 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5a2c4f61-6b61-4907-8601-6eea8065d2f6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.560583 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5a2c4f61-6b61-4907-8601-6eea8065d2f6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.560668 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.560739 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.560787 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5a2c4f61-6b61-4907-8601-6eea8065d2f6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 
22:31:09.560818 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.561413 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.561746 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.563352 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5a2c4f61-6b61-4907-8601-6eea8065d2f6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.563531 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a2c4f61-6b61-4907-8601-6eea8065d2f6-config-data\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.566611 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.567896 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5a2c4f61-6b61-4907-8601-6eea8065d2f6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.568231 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5a2c4f61-6b61-4907-8601-6eea8065d2f6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.568472 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5a2c4f61-6b61-4907-8601-6eea8065d2f6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.570175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5a2c4f61-6b61-4907-8601-6eea8065d2f6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.578727 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.578796 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e508f47409051aeeca58235538460d6ef03a80bec1be13d5a3f95b5d169f15a1/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.596537 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk7gq\" (UniqueName: \"kubernetes.io/projected/5a2c4f61-6b61-4907-8601-6eea8065d2f6-kube-api-access-mk7gq\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.693339 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b10f0963-1dc2-4a38-86c5-0c9c1d9b396a\") pod \"rabbitmq-server-0\" (UID: \"5a2c4f61-6b61-4907-8601-6eea8065d2f6\") " pod="openstack/rabbitmq-server-0" Dec 03 22:31:09 crc kubenswrapper[4830]: I1203 22:31:09.705750 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.823591 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.952945 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-confd\") pod \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.952994 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-tls\") pod \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.953030 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-erlang-cookie\") pod \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.953065 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-plugins-conf\") pod \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.953084 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-erlang-cookie-secret\") pod \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.953927 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea\") pod \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.953958 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-config-data\") pod \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.953994 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-server-conf\") pod \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.954070 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl9qq\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-kube-api-access-jl9qq\") pod \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.954091 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-plugins\") pod \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.954123 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-pod-info\") pod \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\" (UID: \"3fc13f96-b9cf-4e92-bbe6-2c3719041e59\") " Dec 03 
22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.954330 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3fc13f96-b9cf-4e92-bbe6-2c3719041e59" (UID: "3fc13f96-b9cf-4e92-bbe6-2c3719041e59"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.954919 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.958303 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3fc13f96-b9cf-4e92-bbe6-2c3719041e59" (UID: "3fc13f96-b9cf-4e92-bbe6-2c3719041e59"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.961047 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-pod-info" (OuterVolumeSpecName: "pod-info") pod "3fc13f96-b9cf-4e92-bbe6-2c3719041e59" (UID: "3fc13f96-b9cf-4e92-bbe6-2c3719041e59"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.961881 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3fc13f96-b9cf-4e92-bbe6-2c3719041e59" (UID: "3fc13f96-b9cf-4e92-bbe6-2c3719041e59"). 
InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.967558 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3fc13f96-b9cf-4e92-bbe6-2c3719041e59" (UID: "3fc13f96-b9cf-4e92-bbe6-2c3719041e59"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.982077 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3fc13f96-b9cf-4e92-bbe6-2c3719041e59" (UID: "3fc13f96-b9cf-4e92-bbe6-2c3719041e59"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.985528 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-kube-api-access-jl9qq" (OuterVolumeSpecName: "kube-api-access-jl9qq") pod "3fc13f96-b9cf-4e92-bbe6-2c3719041e59" (UID: "3fc13f96-b9cf-4e92-bbe6-2c3719041e59"). InnerVolumeSpecName "kube-api-access-jl9qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:13 crc kubenswrapper[4830]: I1203 22:31:13.999105 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea" (OuterVolumeSpecName: "persistence") pod "3fc13f96-b9cf-4e92-bbe6-2c3719041e59" (UID: "3fc13f96-b9cf-4e92-bbe6-2c3719041e59"). InnerVolumeSpecName "pvc-16809d31-3801-4939-90b2-8372afe3cbea". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.032216 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-config-data" (OuterVolumeSpecName: "config-data") pod "3fc13f96-b9cf-4e92-bbe6-2c3719041e59" (UID: "3fc13f96-b9cf-4e92-bbe6-2c3719041e59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.056099 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl9qq\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-kube-api-access-jl9qq\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.056127 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.056156 4830 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.056166 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.056175 4830 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.056183 4830 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.056209 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-16809d31-3801-4939-90b2-8372afe3cbea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea\") on node \"crc\" " Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.056220 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.069343 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-server-conf" (OuterVolumeSpecName: "server-conf") pod "3fc13f96-b9cf-4e92-bbe6-2c3719041e59" (UID: "3fc13f96-b9cf-4e92-bbe6-2c3719041e59"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.100217 4830 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.100418 4830 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-16809d31-3801-4939-90b2-8372afe3cbea" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea") on node "crc" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.114035 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3fc13f96-b9cf-4e92-bbe6-2c3719041e59" (UID: "3fc13f96-b9cf-4e92-bbe6-2c3719041e59"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.158283 4830 reconciler_common.go:293] "Volume detached for volume \"pvc-16809d31-3801-4939-90b2-8372afe3cbea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.158312 4830 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.158325 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3fc13f96-b9cf-4e92-bbe6-2c3719041e59-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.250268 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3fc13f96-b9cf-4e92-bbe6-2c3719041e59","Type":"ContainerDied","Data":"c3c73ec921fa2aa67e815b9cc1f0737a0b8860fa25edfe6a9c662b7c45cfea09"} Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.250368 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.284547 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.296086 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.305407 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 22:31:14 crc kubenswrapper[4830]: E1203 22:31:14.306046 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" containerName="setup-container" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.306112 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" containerName="setup-container" Dec 03 22:31:14 crc kubenswrapper[4830]: E1203 22:31:14.306170 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" containerName="rabbitmq" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.306220 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" containerName="rabbitmq" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.306454 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" containerName="rabbitmq" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.307814 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.313025 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.313216 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.313834 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.313938 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.313936 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.313985 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jjng6" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.314140 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.322845 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.463790 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.464117 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.464144 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.464176 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.464211 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.464266 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-16809d31-3801-4939-90b2-8372afe3cbea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.464296 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.464319 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.464432 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wschd\" (UniqueName: \"kubernetes.io/projected/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-kube-api-access-wschd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.464451 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.464498 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566120 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566173 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566205 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566230 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566265 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566310 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-16809d31-3801-4939-90b2-8372afe3cbea\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566340 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566361 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566422 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wschd\" (UniqueName: \"kubernetes.io/projected/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-kube-api-access-wschd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566444 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566472 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.566651 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.567799 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.568469 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.569013 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.569899 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.571736 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.572698 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.572745 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-16809d31-3801-4939-90b2-8372afe3cbea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/856d33f0f69f6ddab83e3004572787ee6d83b55286f5e7938a47fe9f0c93e013/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.574544 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.575655 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 
22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.580937 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.585591 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wschd\" (UniqueName: \"kubernetes.io/projected/6fb3b204-2b5a-4dcb-a278-d58ea0dce557-kube-api-access-wschd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.627374 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-16809d31-3801-4939-90b2-8372afe3cbea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16809d31-3801-4939-90b2-8372afe3cbea\") pod \"rabbitmq-cell1-server-0\" (UID: \"6fb3b204-2b5a-4dcb-a278-d58ea0dce557\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:14 crc kubenswrapper[4830]: E1203 22:31:14.643538 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Dec 03 22:31:14 crc kubenswrapper[4830]: E1203 22:31:14.643593 4830 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Dec 03 22:31:14 crc kubenswrapper[4830]: E1203 22:31:14.643696 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g97nt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*t
rue,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4v4rm_openstack(51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:31:14 crc kubenswrapper[4830]: E1203 22:31:14.644830 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-4v4rm" podUID="51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" Dec 03 22:31:14 crc kubenswrapper[4830]: I1203 22:31:14.927560 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:15 crc kubenswrapper[4830]: I1203 22:31:15.088300 4830 scope.go:117] "RemoveContainer" containerID="fea7ef4434389728b9f81d8aa6ccd5a9ac43c0d7acae08c6c3292151a1167cff" Dec 03 22:31:15 crc kubenswrapper[4830]: E1203 22:31:15.114735 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 03 22:31:15 crc kubenswrapper[4830]: E1203 22:31:15.115110 4830 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 03 22:31:15 crc kubenswrapper[4830]: E1203 22:31:15.115324 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n54dh68h566hc6h694h58fhc4h98h5d6h699h669h8bh656h64h58fh57dh668h577h5cch664h55fh5c5h5fdhbch67dhd7h7ch676h9bh595h699hd9q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9bh7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:31:15 crc kubenswrapper[4830]: I1203 22:31:15.192007 4830 scope.go:117] "RemoveContainer" containerID="accd5ab84c124d6c490627992a78ee18f80188c66c763bb70f6a9af31dbbe222" Dec 03 22:31:15 crc kubenswrapper[4830]: I1203 22:31:15.308478 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: i/o timeout" Dec 03 22:31:15 crc kubenswrapper[4830]: E1203 22:31:15.428249 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested\\\"\"" pod="openstack/cloudkitty-db-sync-4v4rm" podUID="51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" Dec 03 22:31:15 crc kubenswrapper[4830]: I1203 22:31:15.459051 4830 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="3fc13f96-b9cf-4e92-bbe6-2c3719041e59" path="/var/lib/kubelet/pods/3fc13f96-b9cf-4e92-bbe6-2c3719041e59/volumes" Dec 03 22:31:15 crc kubenswrapper[4830]: I1203 22:31:15.529890 4830 scope.go:117] "RemoveContainer" containerID="b770f5ff20096e432c2f76473a1a5de1ea08c80b6cf3d629c1285e3a33deff49" Dec 03 22:31:15 crc kubenswrapper[4830]: I1203 22:31:15.857567 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-8zh8s"] Dec 03 22:31:15 crc kubenswrapper[4830]: I1203 22:31:15.944897 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mh6hf"] Dec 03 22:31:15 crc kubenswrapper[4830]: W1203 22:31:15.953273 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod034a9c41_0e84_4351_9483_453227579bda.slice/crio-696e4bf1b0539acec3195d1ff3c286250e6c7189592419351aec80f53aafab29 WatchSource:0}: Error finding container 696e4bf1b0539acec3195d1ff3c286250e6c7189592419351aec80f53aafab29: Status 404 returned error can't find the container with id 696e4bf1b0539acec3195d1ff3c286250e6c7189592419351aec80f53aafab29 Dec 03 22:31:16 crc kubenswrapper[4830]: I1203 22:31:16.075330 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 22:31:16 crc kubenswrapper[4830]: I1203 22:31:16.082814 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 22:31:16 crc kubenswrapper[4830]: W1203 22:31:16.095614 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fb3b204_2b5a_4dcb_a278_d58ea0dce557.slice/crio-5e483123094b77225803f0f6b955d47a7214f5855a477cf6b54fa9c5031f02f5 WatchSource:0}: Error finding container 5e483123094b77225803f0f6b955d47a7214f5855a477cf6b54fa9c5031f02f5: Status 404 returned error can't find the container with id 
5e483123094b77225803f0f6b955d47a7214f5855a477cf6b54fa9c5031f02f5 Dec 03 22:31:16 crc kubenswrapper[4830]: W1203 22:31:16.097912 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a2c4f61_6b61_4907_8601_6eea8065d2f6.slice/crio-8edb1116f0f4e61a34d8a4bf138a8dcd107df2d607bf823f85000ecc4a6bfeb0 WatchSource:0}: Error finding container 8edb1116f0f4e61a34d8a4bf138a8dcd107df2d607bf823f85000ecc4a6bfeb0: Status 404 returned error can't find the container with id 8edb1116f0f4e61a34d8a4bf138a8dcd107df2d607bf823f85000ecc4a6bfeb0 Dec 03 22:31:16 crc kubenswrapper[4830]: I1203 22:31:16.295975 4830 generic.go:334] "Generic (PLEG): container finished" podID="034a9c41-0e84-4351-9483-453227579bda" containerID="e7a301a2342151cdf6644eaa735ba718818e2ec4b8d32440e695e2de26efb05d" exitCode=0 Dec 03 22:31:16 crc kubenswrapper[4830]: I1203 22:31:16.296077 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh6hf" event={"ID":"034a9c41-0e84-4351-9483-453227579bda","Type":"ContainerDied","Data":"e7a301a2342151cdf6644eaa735ba718818e2ec4b8d32440e695e2de26efb05d"} Dec 03 22:31:16 crc kubenswrapper[4830]: I1203 22:31:16.296122 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh6hf" event={"ID":"034a9c41-0e84-4351-9483-453227579bda","Type":"ContainerStarted","Data":"696e4bf1b0539acec3195d1ff3c286250e6c7189592419351aec80f53aafab29"} Dec 03 22:31:16 crc kubenswrapper[4830]: I1203 22:31:16.299652 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5a2c4f61-6b61-4907-8601-6eea8065d2f6","Type":"ContainerStarted","Data":"8edb1116f0f4e61a34d8a4bf138a8dcd107df2d607bf823f85000ecc4a6bfeb0"} Dec 03 22:31:16 crc kubenswrapper[4830]: I1203 22:31:16.301525 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf","Type":"ContainerStarted","Data":"6110b91b544932a1456c702f0664215855eaba6325efe71b8af7e68180945d2a"} Dec 03 22:31:16 crc kubenswrapper[4830]: I1203 22:31:16.304104 4830 generic.go:334] "Generic (PLEG): container finished" podID="cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" containerID="b5d3394e52783e49c337d2ff8592059e08a12b6143c3fe47d0d5b9b2e35dba46" exitCode=0 Dec 03 22:31:16 crc kubenswrapper[4830]: I1203 22:31:16.304151 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" event={"ID":"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6","Type":"ContainerDied","Data":"b5d3394e52783e49c337d2ff8592059e08a12b6143c3fe47d0d5b9b2e35dba46"} Dec 03 22:31:16 crc kubenswrapper[4830]: I1203 22:31:16.304169 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" event={"ID":"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6","Type":"ContainerStarted","Data":"03499c1193f51b24859b40e0a7492c94b131f8f685aa6bf103a6b35588fb762d"} Dec 03 22:31:16 crc kubenswrapper[4830]: I1203 22:31:16.306722 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6fb3b204-2b5a-4dcb-a278-d58ea0dce557","Type":"ContainerStarted","Data":"5e483123094b77225803f0f6b955d47a7214f5855a477cf6b54fa9c5031f02f5"} Dec 03 22:31:17 crc kubenswrapper[4830]: I1203 22:31:17.323966 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh6hf" event={"ID":"034a9c41-0e84-4351-9483-453227579bda","Type":"ContainerStarted","Data":"e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41"} Dec 03 22:31:17 crc kubenswrapper[4830]: I1203 22:31:17.327469 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf","Type":"ContainerStarted","Data":"587d76d16c4bcfc380a70962ee4ef2c6437c58eea69a165fa60942e4603f3b0c"} Dec 03 22:31:17 crc 
kubenswrapper[4830]: I1203 22:31:17.330560 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" event={"ID":"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6","Type":"ContainerStarted","Data":"4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff"} Dec 03 22:31:17 crc kubenswrapper[4830]: I1203 22:31:17.330719 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:17 crc kubenswrapper[4830]: I1203 22:31:17.373997 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" podStartSLOduration=13.373977377 podStartE2EDuration="13.373977377s" podCreationTimestamp="2025-12-03 22:31:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:31:17.369253519 +0000 UTC m=+1566.365714868" watchObservedRunningTime="2025-12-03 22:31:17.373977377 +0000 UTC m=+1566.370438726" Dec 03 22:31:18 crc kubenswrapper[4830]: I1203 22:31:18.343046 4830 generic.go:334] "Generic (PLEG): container finished" podID="034a9c41-0e84-4351-9483-453227579bda" containerID="e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41" exitCode=0 Dec 03 22:31:18 crc kubenswrapper[4830]: I1203 22:31:18.343125 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh6hf" event={"ID":"034a9c41-0e84-4351-9483-453227579bda","Type":"ContainerDied","Data":"e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41"} Dec 03 22:31:18 crc kubenswrapper[4830]: I1203 22:31:18.356540 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6fb3b204-2b5a-4dcb-a278-d58ea0dce557","Type":"ContainerStarted","Data":"67ffa2d15cfcc47d2f4802ad7cc2bba4714d99713d0174eb0626d5d9bdfd99a0"} Dec 03 22:31:19 crc kubenswrapper[4830]: I1203 22:31:19.184393 4830 
scope.go:117] "RemoveContainer" containerID="3e0f9ecf29bdb81ba42134b6cbadcbf1f7b6307b02fe467611e6883fe907b3d7" Dec 03 22:31:19 crc kubenswrapper[4830]: I1203 22:31:19.369016 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5a2c4f61-6b61-4907-8601-6eea8065d2f6","Type":"ContainerStarted","Data":"5fe4a7a95e30e02d816c461f07ba934fbdf013c73d5fcc12dbc6fef802acca24"} Dec 03 22:31:21 crc kubenswrapper[4830]: E1203 22:31:21.269421 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf" Dec 03 22:31:21 crc kubenswrapper[4830]: I1203 22:31:21.405268 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh6hf" event={"ID":"034a9c41-0e84-4351-9483-453227579bda","Type":"ContainerStarted","Data":"80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d"} Dec 03 22:31:21 crc kubenswrapper[4830]: I1203 22:31:21.408450 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf","Type":"ContainerStarted","Data":"2775731c613a3ca4a376d7a109ecd89f1b472a86a48aca2c2e1f636a5e4dfd8b"} Dec 03 22:31:21 crc kubenswrapper[4830]: I1203 22:31:21.408692 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:31:21 crc kubenswrapper[4830]: E1203 22:31:21.410875 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf" Dec 03 22:31:21 crc 
kubenswrapper[4830]: I1203 22:31:21.431285 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mh6hf" podStartSLOduration=17.873315756 podStartE2EDuration="22.431268598s" podCreationTimestamp="2025-12-03 22:30:59 +0000 UTC" firstStartedPulling="2025-12-03 22:31:16.298650274 +0000 UTC m=+1565.295111613" lastFinishedPulling="2025-12-03 22:31:20.856603096 +0000 UTC m=+1569.853064455" observedRunningTime="2025-12-03 22:31:21.429020147 +0000 UTC m=+1570.425481506" watchObservedRunningTime="2025-12-03 22:31:21.431268598 +0000 UTC m=+1570.427729947" Dec 03 22:31:22 crc kubenswrapper[4830]: E1203 22:31:22.418942 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.205822 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.294972 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-j9jwq"] Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.295211 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" podUID="720ca8ef-d526-423c-a334-4e6a771a8e6e" containerName="dnsmasq-dns" containerID="cri-o://1e3d5f5bfde74bbd04a9649be99a7dc4d5279ba86303e292f6ac3f7ec8a2517c" gracePeriod=10 Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.475260 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-pzsz8"] Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.477128 4830 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.477658 4830 generic.go:334] "Generic (PLEG): container finished" podID="720ca8ef-d526-423c-a334-4e6a771a8e6e" containerID="1e3d5f5bfde74bbd04a9649be99a7dc4d5279ba86303e292f6ac3f7ec8a2517c" exitCode=0 Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.477695 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" event={"ID":"720ca8ef-d526-423c-a334-4e6a771a8e6e","Type":"ContainerDied","Data":"1e3d5f5bfde74bbd04a9649be99a7dc4d5279ba86303e292f6ac3f7ec8a2517c"} Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.483816 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-pzsz8"] Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.552004 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc4zq\" (UniqueName: \"kubernetes.io/projected/f5a1439e-ab65-4263-bbfc-09933a4db924-kube-api-access-rc4zq\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.552046 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.552081 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: 
\"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.552127 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-config\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.552159 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-dns-svc\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.552191 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.552775 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.659621 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-dns-svc\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: 
\"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.659685 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.659786 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.659820 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.659836 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc4zq\" (UniqueName: \"kubernetes.io/projected/f5a1439e-ab65-4263-bbfc-09933a4db924-kube-api-access-rc4zq\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.659866 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: 
\"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.659906 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-config\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.660790 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-config\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.661433 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-dns-svc\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.662083 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.662617 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: 
I1203 22:31:25.663103 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.663831 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a1439e-ab65-4263-bbfc-09933a4db924-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.718632 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc4zq\" (UniqueName: \"kubernetes.io/projected/f5a1439e-ab65-4263-bbfc-09933a4db924-kube-api-access-rc4zq\") pod \"dnsmasq-dns-85f64749dc-pzsz8\" (UID: \"f5a1439e-ab65-4263-bbfc-09933a4db924\") " pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:25 crc kubenswrapper[4830]: I1203 22:31:25.819978 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.094328 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.168363 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-config\") pod \"720ca8ef-d526-423c-a334-4e6a771a8e6e\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.168415 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-nb\") pod \"720ca8ef-d526-423c-a334-4e6a771a8e6e\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.168578 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-swift-storage-0\") pod \"720ca8ef-d526-423c-a334-4e6a771a8e6e\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.168748 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm28x\" (UniqueName: \"kubernetes.io/projected/720ca8ef-d526-423c-a334-4e6a771a8e6e-kube-api-access-nm28x\") pod \"720ca8ef-d526-423c-a334-4e6a771a8e6e\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.168785 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-svc\") pod \"720ca8ef-d526-423c-a334-4e6a771a8e6e\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.168843 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-sb\") pod \"720ca8ef-d526-423c-a334-4e6a771a8e6e\" (UID: \"720ca8ef-d526-423c-a334-4e6a771a8e6e\") " Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.176280 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720ca8ef-d526-423c-a334-4e6a771a8e6e-kube-api-access-nm28x" (OuterVolumeSpecName: "kube-api-access-nm28x") pod "720ca8ef-d526-423c-a334-4e6a771a8e6e" (UID: "720ca8ef-d526-423c-a334-4e6a771a8e6e"). InnerVolumeSpecName "kube-api-access-nm28x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.245207 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "720ca8ef-d526-423c-a334-4e6a771a8e6e" (UID: "720ca8ef-d526-423c-a334-4e6a771a8e6e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.246572 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "720ca8ef-d526-423c-a334-4e6a771a8e6e" (UID: "720ca8ef-d526-423c-a334-4e6a771a8e6e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.271315 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.271356 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.271367 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm28x\" (UniqueName: \"kubernetes.io/projected/720ca8ef-d526-423c-a334-4e6a771a8e6e-kube-api-access-nm28x\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.272148 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "720ca8ef-d526-423c-a334-4e6a771a8e6e" (UID: "720ca8ef-d526-423c-a334-4e6a771a8e6e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.276250 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "720ca8ef-d526-423c-a334-4e6a771a8e6e" (UID: "720ca8ef-d526-423c-a334-4e6a771a8e6e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.277630 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-config" (OuterVolumeSpecName: "config") pod "720ca8ef-d526-423c-a334-4e6a771a8e6e" (UID: "720ca8ef-d526-423c-a334-4e6a771a8e6e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.373663 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.373708 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.373722 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/720ca8ef-d526-423c-a334-4e6a771a8e6e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.496912 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" event={"ID":"720ca8ef-d526-423c-a334-4e6a771a8e6e","Type":"ContainerDied","Data":"e2cc6703772ce875e8914d9cfb12f6dd49371afec3b97ec7af0a5b9d8e7f782d"} Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.496989 4830 scope.go:117] "RemoveContainer" containerID="1e3d5f5bfde74bbd04a9649be99a7dc4d5279ba86303e292f6ac3f7ec8a2517c" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.497293 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-j9jwq" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.540005 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-pzsz8"] Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.552014 4830 scope.go:117] "RemoveContainer" containerID="ada712e13a46cd0510bb004aa90abfd7c51b104f464386e9032c4533cb099365" Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.559705 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-j9jwq"] Dec 03 22:31:26 crc kubenswrapper[4830]: W1203 22:31:26.560522 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a1439e_ab65_4263_bbfc_09933a4db924.slice/crio-e11bb2cc13272c83c47ad067146c94805dbe47a19dd3709d49d55cce6d7c6268 WatchSource:0}: Error finding container e11bb2cc13272c83c47ad067146c94805dbe47a19dd3709d49d55cce6d7c6268: Status 404 returned error can't find the container with id e11bb2cc13272c83c47ad067146c94805dbe47a19dd3709d49d55cce6d7c6268 Dec 03 22:31:26 crc kubenswrapper[4830]: I1203 22:31:26.569004 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-j9jwq"] Dec 03 22:31:27 crc kubenswrapper[4830]: I1203 22:31:27.351293 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720ca8ef-d526-423c-a334-4e6a771a8e6e" path="/var/lib/kubelet/pods/720ca8ef-d526-423c-a334-4e6a771a8e6e/volumes" Dec 03 22:31:27 crc kubenswrapper[4830]: I1203 22:31:27.526194 4830 generic.go:334] "Generic (PLEG): container finished" podID="f5a1439e-ab65-4263-bbfc-09933a4db924" containerID="474a139bfd650ec8a76a8ccb08cdec918b84443961e2b86ca2b736180d0aeec5" exitCode=0 Dec 03 22:31:27 crc kubenswrapper[4830]: I1203 22:31:27.526289 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" 
event={"ID":"f5a1439e-ab65-4263-bbfc-09933a4db924","Type":"ContainerDied","Data":"474a139bfd650ec8a76a8ccb08cdec918b84443961e2b86ca2b736180d0aeec5"} Dec 03 22:31:27 crc kubenswrapper[4830]: I1203 22:31:27.526334 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" event={"ID":"f5a1439e-ab65-4263-bbfc-09933a4db924","Type":"ContainerStarted","Data":"e11bb2cc13272c83c47ad067146c94805dbe47a19dd3709d49d55cce6d7c6268"} Dec 03 22:31:28 crc kubenswrapper[4830]: I1203 22:31:28.538889 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" event={"ID":"f5a1439e-ab65-4263-bbfc-09933a4db924","Type":"ContainerStarted","Data":"2eac84ccc2cfcc93aec38e9f81a22d0c53bc382c193e0c83b13134a9a9e73822"} Dec 03 22:31:28 crc kubenswrapper[4830]: I1203 22:31:28.540069 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 03 22:31:28 crc kubenswrapper[4830]: I1203 22:31:28.566613 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" podStartSLOduration=3.566594266 podStartE2EDuration="3.566594266s" podCreationTimestamp="2025-12-03 22:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:31:28.55786064 +0000 UTC m=+1577.554321979" watchObservedRunningTime="2025-12-03 22:31:28.566594266 +0000 UTC m=+1577.563055615" Dec 03 22:31:29 crc kubenswrapper[4830]: I1203 22:31:29.509171 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 22:31:30 crc kubenswrapper[4830]: I1203 22:31:30.154114 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:30 crc kubenswrapper[4830]: I1203 22:31:30.154710 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:30 crc kubenswrapper[4830]: I1203 22:31:30.247768 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:30 crc kubenswrapper[4830]: I1203 22:31:30.559940 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-4v4rm" event={"ID":"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6","Type":"ContainerStarted","Data":"f5f2d638b620b0ad73387d7243395a60174a6ea21f7a5148da06db4ace444741"} Dec 03 22:31:30 crc kubenswrapper[4830]: I1203 22:31:30.594432 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-4v4rm" podStartSLOduration=2.194327977 podStartE2EDuration="41.594412579s" podCreationTimestamp="2025-12-03 22:30:49 +0000 UTC" firstStartedPulling="2025-12-03 22:30:50.10592251 +0000 UTC m=+1539.102383869" lastFinishedPulling="2025-12-03 22:31:29.506007122 +0000 UTC m=+1578.502468471" observedRunningTime="2025-12-03 22:31:30.58667078 +0000 UTC m=+1579.583132179" watchObservedRunningTime="2025-12-03 22:31:30.594412579 +0000 UTC m=+1579.590873938" Dec 03 22:31:30 crc kubenswrapper[4830]: I1203 22:31:30.618309 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:31 crc kubenswrapper[4830]: I1203 22:31:31.020432 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mh6hf"] Dec 03 22:31:32 crc kubenswrapper[4830]: I1203 22:31:32.582716 4830 generic.go:334] "Generic (PLEG): container finished" podID="51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" containerID="f5f2d638b620b0ad73387d7243395a60174a6ea21f7a5148da06db4ace444741" exitCode=0 Dec 03 22:31:32 crc kubenswrapper[4830]: I1203 22:31:32.583459 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-mh6hf" podUID="034a9c41-0e84-4351-9483-453227579bda" containerName="registry-server" containerID="cri-o://80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d" gracePeriod=2 Dec 03 22:31:32 crc kubenswrapper[4830]: I1203 22:31:32.582890 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-4v4rm" event={"ID":"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6","Type":"ContainerDied","Data":"f5f2d638b620b0ad73387d7243395a60174a6ea21f7a5148da06db4ace444741"} Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.237935 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.367222 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-catalog-content\") pod \"034a9c41-0e84-4351-9483-453227579bda\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.367311 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-utilities\") pod \"034a9c41-0e84-4351-9483-453227579bda\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.367524 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7st47\" (UniqueName: \"kubernetes.io/projected/034a9c41-0e84-4351-9483-453227579bda-kube-api-access-7st47\") pod \"034a9c41-0e84-4351-9483-453227579bda\" (UID: \"034a9c41-0e84-4351-9483-453227579bda\") " Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.368232 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-utilities" (OuterVolumeSpecName: "utilities") pod "034a9c41-0e84-4351-9483-453227579bda" (UID: "034a9c41-0e84-4351-9483-453227579bda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.376789 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/034a9c41-0e84-4351-9483-453227579bda-kube-api-access-7st47" (OuterVolumeSpecName: "kube-api-access-7st47") pod "034a9c41-0e84-4351-9483-453227579bda" (UID: "034a9c41-0e84-4351-9483-453227579bda"). InnerVolumeSpecName "kube-api-access-7st47". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.427573 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "034a9c41-0e84-4351-9483-453227579bda" (UID: "034a9c41-0e84-4351-9483-453227579bda"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.470321 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7st47\" (UniqueName: \"kubernetes.io/projected/034a9c41-0e84-4351-9483-453227579bda-kube-api-access-7st47\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.470350 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.470360 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/034a9c41-0e84-4351-9483-453227579bda-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.595269 4830 generic.go:334] "Generic (PLEG): container finished" podID="034a9c41-0e84-4351-9483-453227579bda" containerID="80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d" exitCode=0 Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.595367 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh6hf" event={"ID":"034a9c41-0e84-4351-9483-453227579bda","Type":"ContainerDied","Data":"80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d"} Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.595408 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mh6hf" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.595441 4830 scope.go:117] "RemoveContainer" containerID="80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.595427 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh6hf" event={"ID":"034a9c41-0e84-4351-9483-453227579bda","Type":"ContainerDied","Data":"696e4bf1b0539acec3195d1ff3c286250e6c7189592419351aec80f53aafab29"} Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.635197 4830 scope.go:117] "RemoveContainer" containerID="e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.642662 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mh6hf"] Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.653133 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mh6hf"] Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.658099 4830 scope.go:117] "RemoveContainer" containerID="e7a301a2342151cdf6644eaa735ba718818e2ec4b8d32440e695e2de26efb05d" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.722388 4830 scope.go:117] "RemoveContainer" containerID="80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d" Dec 03 22:31:33 crc kubenswrapper[4830]: E1203 22:31:33.724764 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d\": container with ID starting with 80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d not found: ID does not exist" containerID="80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.724798 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d"} err="failed to get container status \"80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d\": rpc error: code = NotFound desc = could not find container \"80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d\": container with ID starting with 80a44c53dc9d83ef7c5357800ae10da7bb726d73b55f91d4a81cbf44dfddc53d not found: ID does not exist" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.724843 4830 scope.go:117] "RemoveContainer" containerID="e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41" Dec 03 22:31:33 crc kubenswrapper[4830]: E1203 22:31:33.730168 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41\": container with ID starting with e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41 not found: ID does not exist" containerID="e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.730226 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41"} err="failed to get container status \"e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41\": rpc error: code = NotFound desc = could not find container \"e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41\": container with ID starting with e64c6c0f1d8c406662ff9061df1a4ece4bafc063917bef84c5834a84f759dc41 not found: ID does not exist" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.730256 4830 scope.go:117] "RemoveContainer" containerID="e7a301a2342151cdf6644eaa735ba718818e2ec4b8d32440e695e2de26efb05d" Dec 03 22:31:33 crc kubenswrapper[4830]: E1203 
22:31:33.731837 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7a301a2342151cdf6644eaa735ba718818e2ec4b8d32440e695e2de26efb05d\": container with ID starting with e7a301a2342151cdf6644eaa735ba718818e2ec4b8d32440e695e2de26efb05d not found: ID does not exist" containerID="e7a301a2342151cdf6644eaa735ba718818e2ec4b8d32440e695e2de26efb05d" Dec 03 22:31:33 crc kubenswrapper[4830]: I1203 22:31:33.731885 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a301a2342151cdf6644eaa735ba718818e2ec4b8d32440e695e2de26efb05d"} err="failed to get container status \"e7a301a2342151cdf6644eaa735ba718818e2ec4b8d32440e695e2de26efb05d\": rpc error: code = NotFound desc = could not find container \"e7a301a2342151cdf6644eaa735ba718818e2ec4b8d32440e695e2de26efb05d\": container with ID starting with e7a301a2342151cdf6644eaa735ba718818e2ec4b8d32440e695e2de26efb05d not found: ID does not exist" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.050894 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.184651 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-scripts\") pod \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.184731 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-combined-ca-bundle\") pod \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.184784 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-certs\") pod \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.184840 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g97nt\" (UniqueName: \"kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-kube-api-access-g97nt\") pod \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.184964 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-config-data\") pod \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\" (UID: \"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6\") " Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.189318 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-kube-api-access-g97nt" (OuterVolumeSpecName: "kube-api-access-g97nt") pod "51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" (UID: "51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6"). InnerVolumeSpecName "kube-api-access-g97nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.189395 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-certs" (OuterVolumeSpecName: "certs") pod "51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" (UID: "51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.190054 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-scripts" (OuterVolumeSpecName: "scripts") pod "51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" (UID: "51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.221484 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" (UID: "51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.237363 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-config-data" (OuterVolumeSpecName: "config-data") pod "51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" (UID: "51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.288148 4830 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.288190 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g97nt\" (UniqueName: \"kubernetes.io/projected/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-kube-api-access-g97nt\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.288204 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.288215 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.288224 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.637954 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-4v4rm" event={"ID":"51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6","Type":"ContainerDied","Data":"448af7412e86918b5e76553acca326e6a0ef587b74745778dde2777faa440ca3"} Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.638263 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="448af7412e86918b5e76553acca326e6a0ef587b74745778dde2777faa440ca3" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.638107 4830 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-4v4rm" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.714365 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-hvnzw"] Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.724630 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-hvnzw"] Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.840488 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-69q52"] Dec 03 22:31:34 crc kubenswrapper[4830]: E1203 22:31:34.850088 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="034a9c41-0e84-4351-9483-453227579bda" containerName="extract-utilities" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.850152 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="034a9c41-0e84-4351-9483-453227579bda" containerName="extract-utilities" Dec 03 22:31:34 crc kubenswrapper[4830]: E1203 22:31:34.850180 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="034a9c41-0e84-4351-9483-453227579bda" containerName="registry-server" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.850190 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="034a9c41-0e84-4351-9483-453227579bda" containerName="registry-server" Dec 03 22:31:34 crc kubenswrapper[4830]: E1203 22:31:34.850207 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720ca8ef-d526-423c-a334-4e6a771a8e6e" containerName="init" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.850217 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="720ca8ef-d526-423c-a334-4e6a771a8e6e" containerName="init" Dec 03 22:31:34 crc kubenswrapper[4830]: E1203 22:31:34.850232 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" containerName="cloudkitty-db-sync" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 
22:31:34.850240 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" containerName="cloudkitty-db-sync" Dec 03 22:31:34 crc kubenswrapper[4830]: E1203 22:31:34.850261 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="034a9c41-0e84-4351-9483-453227579bda" containerName="extract-content" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.850269 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="034a9c41-0e84-4351-9483-453227579bda" containerName="extract-content" Dec 03 22:31:34 crc kubenswrapper[4830]: E1203 22:31:34.850298 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720ca8ef-d526-423c-a334-4e6a771a8e6e" containerName="dnsmasq-dns" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.850307 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="720ca8ef-d526-423c-a334-4e6a771a8e6e" containerName="dnsmasq-dns" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.850622 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="720ca8ef-d526-423c-a334-4e6a771a8e6e" containerName="dnsmasq-dns" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.850645 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" containerName="cloudkitty-db-sync" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.850674 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="034a9c41-0e84-4351-9483-453227579bda" containerName="registry-server" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.851704 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.863854 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 22:31:34 crc kubenswrapper[4830]: I1203 22:31:34.865770 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-69q52"] Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.007465 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-scripts\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.007547 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-combined-ca-bundle\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.007607 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-config-data\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.007639 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86pp8\" (UniqueName: \"kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-kube-api-access-86pp8\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " 
pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.007783 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-certs\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.109993 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-scripts\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.110063 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-combined-ca-bundle\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.110099 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-config-data\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.110123 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86pp8\" (UniqueName: \"kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-kube-api-access-86pp8\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 
crc kubenswrapper[4830]: I1203 22:31:35.110179 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-certs\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.115065 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-combined-ca-bundle\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.115275 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-config-data\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.115716 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-certs\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.116904 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-scripts\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.127230 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-86pp8\" (UniqueName: \"kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-kube-api-access-86pp8\") pod \"cloudkitty-storageinit-69q52\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.190272 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.362456 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="034a9c41-0e84-4351-9483-453227579bda" path="/var/lib/kubelet/pods/034a9c41-0e84-4351-9483-453227579bda/volumes" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.363796 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a4d391d-0a89-417b-b548-b4754e4dcc99" path="/var/lib/kubelet/pods/5a4d391d-0a89-417b-b548-b4754e4dcc99/volumes" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.364572 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.681829 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-69q52"] Dec 03 22:31:35 crc kubenswrapper[4830]: W1203 22:31:35.687015 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf517ffb5_a85a_4683_b721_ef130df773dc.slice/crio-c58f8103d26fe3451b8739edf10628070912b68ff01c4b80d8a9f2e5aa65909c WatchSource:0}: Error finding container c58f8103d26fe3451b8739edf10628070912b68ff01c4b80d8a9f2e5aa65909c: Status 404 returned error can't find the container with id c58f8103d26fe3451b8739edf10628070912b68ff01c4b80d8a9f2e5aa65909c Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.822327 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f64749dc-pzsz8" Dec 
03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.889373 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-8zh8s"] Dec 03 22:31:35 crc kubenswrapper[4830]: I1203 22:31:35.889686 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" podUID="cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" containerName="dnsmasq-dns" containerID="cri-o://4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff" gracePeriod=10 Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.646318 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.660121 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-69q52" event={"ID":"f517ffb5-a85a-4683-b721-ef130df773dc","Type":"ContainerStarted","Data":"67bad5d47240553a792fe38cbf37fa74ae64a93185be9de7e9fb4c044abfb63e"} Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.660171 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-69q52" event={"ID":"f517ffb5-a85a-4683-b721-ef130df773dc","Type":"ContainerStarted","Data":"c58f8103d26fe3451b8739edf10628070912b68ff01c4b80d8a9f2e5aa65909c"} Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.663744 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf","Type":"ContainerStarted","Data":"69bc7b1a43d9f4c0e77dbbdae12c89305a0211f8340b2f50d797a6132981a50b"} Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.665288 4830 generic.go:334] "Generic (PLEG): container finished" podID="cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" containerID="4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff" exitCode=0 Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.665329 4830 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" event={"ID":"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6","Type":"ContainerDied","Data":"4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff"} Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.665353 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" event={"ID":"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6","Type":"ContainerDied","Data":"03499c1193f51b24859b40e0a7492c94b131f8f685aa6bf103a6b35588fb762d"} Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.665329 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-8zh8s" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.665371 4830 scope.go:117] "RemoveContainer" containerID="4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.692613 4830 scope.go:117] "RemoveContainer" containerID="b5d3394e52783e49c337d2ff8592059e08a12b6143c3fe47d0d5b9b2e35dba46" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.729540 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.26332155 podStartE2EDuration="43.729519496s" podCreationTimestamp="2025-12-03 22:30:53 +0000 UTC" firstStartedPulling="2025-12-03 22:30:55.151846139 +0000 UTC m=+1544.148307488" lastFinishedPulling="2025-12-03 22:31:35.618044055 +0000 UTC m=+1584.614505434" observedRunningTime="2025-12-03 22:31:36.719211417 +0000 UTC m=+1585.715672776" watchObservedRunningTime="2025-12-03 22:31:36.729519496 +0000 UTC m=+1585.725980835" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.732265 4830 scope.go:117] "RemoveContainer" containerID="4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff" Dec 03 22:31:36 crc kubenswrapper[4830]: E1203 22:31:36.732888 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff\": container with ID starting with 4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff not found: ID does not exist" containerID="4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.732925 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff"} err="failed to get container status \"4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff\": rpc error: code = NotFound desc = could not find container \"4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff\": container with ID starting with 4fdcf21f6b891ca7cba43740fd1856d8e6fc0c8bde309b2540df33795cb3fcff not found: ID does not exist" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.732950 4830 scope.go:117] "RemoveContainer" containerID="b5d3394e52783e49c337d2ff8592059e08a12b6143c3fe47d0d5b9b2e35dba46" Dec 03 22:31:36 crc kubenswrapper[4830]: E1203 22:31:36.733243 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5d3394e52783e49c337d2ff8592059e08a12b6143c3fe47d0d5b9b2e35dba46\": container with ID starting with b5d3394e52783e49c337d2ff8592059e08a12b6143c3fe47d0d5b9b2e35dba46 not found: ID does not exist" containerID="b5d3394e52783e49c337d2ff8592059e08a12b6143c3fe47d0d5b9b2e35dba46" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.733284 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5d3394e52783e49c337d2ff8592059e08a12b6143c3fe47d0d5b9b2e35dba46"} err="failed to get container status \"b5d3394e52783e49c337d2ff8592059e08a12b6143c3fe47d0d5b9b2e35dba46\": rpc error: code = NotFound desc = could not find container 
\"b5d3394e52783e49c337d2ff8592059e08a12b6143c3fe47d0d5b9b2e35dba46\": container with ID starting with b5d3394e52783e49c337d2ff8592059e08a12b6143c3fe47d0d5b9b2e35dba46 not found: ID does not exist" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.754777 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-svc\") pod \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.754936 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-config\") pod \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.754986 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmgrc\" (UniqueName: \"kubernetes.io/projected/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-kube-api-access-lmgrc\") pod \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.755376 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-swift-storage-0\") pod \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.758068 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-openstack-edpm-ipam\") pod \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " Dec 03 22:31:36 crc 
kubenswrapper[4830]: I1203 22:31:36.758184 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-nb\") pod \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.758223 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-sb\") pod \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\" (UID: \"cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6\") " Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.760077 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-69q52" podStartSLOduration=2.760060942 podStartE2EDuration="2.760060942s" podCreationTimestamp="2025-12-03 22:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:31:36.753916356 +0000 UTC m=+1585.750377715" watchObservedRunningTime="2025-12-03 22:31:36.760060942 +0000 UTC m=+1585.756522291" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.765432 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-kube-api-access-lmgrc" (OuterVolumeSpecName: "kube-api-access-lmgrc") pod "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" (UID: "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6"). InnerVolumeSpecName "kube-api-access-lmgrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.855497 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-config" (OuterVolumeSpecName: "config") pod "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" (UID: "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.862743 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-config\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.862777 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmgrc\" (UniqueName: \"kubernetes.io/projected/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-kube-api-access-lmgrc\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.872017 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" (UID: "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.878820 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" (UID: "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.888382 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" (UID: "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.907976 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" (UID: "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.910972 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" (UID: "cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.964353 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.964390 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.964404 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.964412 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:36 crc kubenswrapper[4830]: I1203 22:31:36.964421 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:37 crc kubenswrapper[4830]: I1203 22:31:37.003093 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-8zh8s"] Dec 03 22:31:37 crc kubenswrapper[4830]: I1203 22:31:37.015203 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-8zh8s"] Dec 03 22:31:37 crc kubenswrapper[4830]: I1203 22:31:37.349140 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" path="/var/lib/kubelet/pods/cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6/volumes" Dec 03 22:31:37 crc 
kubenswrapper[4830]: I1203 22:31:37.693612 4830 generic.go:334] "Generic (PLEG): container finished" podID="f517ffb5-a85a-4683-b721-ef130df773dc" containerID="67bad5d47240553a792fe38cbf37fa74ae64a93185be9de7e9fb4c044abfb63e" exitCode=0 Dec 03 22:31:37 crc kubenswrapper[4830]: I1203 22:31:37.693749 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-69q52" event={"ID":"f517ffb5-a85a-4683-b721-ef130df773dc","Type":"ContainerDied","Data":"67bad5d47240553a792fe38cbf37fa74ae64a93185be9de7e9fb4c044abfb63e"} Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.183887 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.328852 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-config-data\") pod \"f517ffb5-a85a-4683-b721-ef130df773dc\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.329176 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-scripts\") pod \"f517ffb5-a85a-4683-b721-ef130df773dc\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.329227 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-certs\") pod \"f517ffb5-a85a-4683-b721-ef130df773dc\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.329975 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86pp8\" (UniqueName: 
\"kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-kube-api-access-86pp8\") pod \"f517ffb5-a85a-4683-b721-ef130df773dc\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.330019 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-combined-ca-bundle\") pod \"f517ffb5-a85a-4683-b721-ef130df773dc\" (UID: \"f517ffb5-a85a-4683-b721-ef130df773dc\") " Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.335977 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-scripts" (OuterVolumeSpecName: "scripts") pod "f517ffb5-a85a-4683-b721-ef130df773dc" (UID: "f517ffb5-a85a-4683-b721-ef130df773dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.337836 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-kube-api-access-86pp8" (OuterVolumeSpecName: "kube-api-access-86pp8") pod "f517ffb5-a85a-4683-b721-ef130df773dc" (UID: "f517ffb5-a85a-4683-b721-ef130df773dc"). InnerVolumeSpecName "kube-api-access-86pp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.337926 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-certs" (OuterVolumeSpecName: "certs") pod "f517ffb5-a85a-4683-b721-ef130df773dc" (UID: "f517ffb5-a85a-4683-b721-ef130df773dc"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.368158 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-config-data" (OuterVolumeSpecName: "config-data") pod "f517ffb5-a85a-4683-b721-ef130df773dc" (UID: "f517ffb5-a85a-4683-b721-ef130df773dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.370440 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f517ffb5-a85a-4683-b721-ef130df773dc" (UID: "f517ffb5-a85a-4683-b721-ef130df773dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.432531 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86pp8\" (UniqueName: \"kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-kube-api-access-86pp8\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.432573 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.432585 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.432596 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f517ffb5-a85a-4683-b721-ef130df773dc-scripts\") on node \"crc\" DevicePath \"\"" Dec 
03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.432607 4830 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f517ffb5-a85a-4683-b721-ef130df773dc-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.716307 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-69q52" event={"ID":"f517ffb5-a85a-4683-b721-ef130df773dc","Type":"ContainerDied","Data":"c58f8103d26fe3451b8739edf10628070912b68ff01c4b80d8a9f2e5aa65909c"} Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.716346 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c58f8103d26fe3451b8739edf10628070912b68ff01c4b80d8a9f2e5aa65909c" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.716393 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-69q52" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.813467 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dt49n"] Dec 03 22:31:39 crc kubenswrapper[4830]: E1203 22:31:39.813933 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f517ffb5-a85a-4683-b721-ef130df773dc" containerName="cloudkitty-storageinit" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.813952 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f517ffb5-a85a-4683-b721-ef130df773dc" containerName="cloudkitty-storageinit" Dec 03 22:31:39 crc kubenswrapper[4830]: E1203 22:31:39.813971 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" containerName="init" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.813977 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" containerName="init" Dec 03 22:31:39 crc kubenswrapper[4830]: E1203 22:31:39.813989 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" containerName="dnsmasq-dns" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.813996 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" containerName="dnsmasq-dns" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.814219 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef9a5d9-4bfb-40ab-aa22-0d8d1cbebbb6" containerName="dnsmasq-dns" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.814250 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f517ffb5-a85a-4683-b721-ef130df773dc" containerName="cloudkitty-storageinit" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.815726 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.825078 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt49n"] Dec 03 22:31:39 crc kubenswrapper[4830]: E1203 22:31:39.848553 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf517ffb5_a85a_4683_b721_ef130df773dc.slice\": RecentStats: unable to find data in memory cache]" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.942193 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-catalog-content\") pod \"redhat-marketplace-dt49n\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.942474 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-gh527\" (UniqueName: \"kubernetes.io/projected/fcbd4ff2-fae6-4680-8017-8d64403b9f03-kube-api-access-gh527\") pod \"redhat-marketplace-dt49n\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:39 crc kubenswrapper[4830]: I1203 22:31:39.942669 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-utilities\") pod \"redhat-marketplace-dt49n\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.044350 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh527\" (UniqueName: \"kubernetes.io/projected/fcbd4ff2-fae6-4680-8017-8d64403b9f03-kube-api-access-gh527\") pod \"redhat-marketplace-dt49n\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.044710 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-utilities\") pod \"redhat-marketplace-dt49n\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.044809 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-catalog-content\") pod \"redhat-marketplace-dt49n\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.045486 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-catalog-content\") pod \"redhat-marketplace-dt49n\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.045536 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-utilities\") pod \"redhat-marketplace-dt49n\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.061416 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh527\" (UniqueName: \"kubernetes.io/projected/fcbd4ff2-fae6-4680-8017-8d64403b9f03-kube-api-access-gh527\") pod \"redhat-marketplace-dt49n\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.166491 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.300908 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.301206 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" containerName="cloudkitty-proc" containerID="cri-o://bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f" gracePeriod=30 Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.314128 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.314567 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" containerName="cloudkitty-api-log" containerID="cri-o://2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129" gracePeriod=30 Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.314645 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" containerName="cloudkitty-api" containerID="cri-o://57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563" gracePeriod=30 Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.729302 4830 generic.go:334] "Generic (PLEG): container finished" podID="6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" containerID="2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129" exitCode=143 Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.729387 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae","Type":"ContainerDied","Data":"2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129"} 
Dec 03 22:31:40 crc kubenswrapper[4830]: I1203 22:31:40.790057 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt49n"] Dec 03 22:31:40 crc kubenswrapper[4830]: W1203 22:31:40.796898 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcbd4ff2_fae6_4680_8017_8d64403b9f03.slice/crio-dc24c7df2aa8eab269cda10bcd327d994adc5bc4b05348920bdfd55e35c58580 WatchSource:0}: Error finding container dc24c7df2aa8eab269cda10bcd327d994adc5bc4b05348920bdfd55e35c58580: Status 404 returned error can't find the container with id dc24c7df2aa8eab269cda10bcd327d994adc5bc4b05348920bdfd55e35c58580 Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.681245 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.689003 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.749359 4830 generic.go:334] "Generic (PLEG): container finished" podID="5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" containerID="bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f" exitCode=0 Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.749443 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192","Type":"ContainerDied","Data":"bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f"} Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.749478 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192","Type":"ContainerDied","Data":"40b3cc210fac68d172ad919ba131e9593a1e6659f030a67e2e5a7d943e900512"} Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.749498 4830 scope.go:117] "RemoveContainer" containerID="bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.749659 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.755180 4830 generic.go:334] "Generic (PLEG): container finished" podID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" containerID="e93fbef31bbf4f44c443f2675bfa6400e7000052a76350095eef2d2986e26ad3" exitCode=0 Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.755308 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt49n" event={"ID":"fcbd4ff2-fae6-4680-8017-8d64403b9f03","Type":"ContainerDied","Data":"e93fbef31bbf4f44c443f2675bfa6400e7000052a76350095eef2d2986e26ad3"} Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.755335 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt49n" event={"ID":"fcbd4ff2-fae6-4680-8017-8d64403b9f03","Type":"ContainerStarted","Data":"dc24c7df2aa8eab269cda10bcd327d994adc5bc4b05348920bdfd55e35c58580"} Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.759167 4830 generic.go:334] "Generic (PLEG): container finished" podID="6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" containerID="57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563" exitCode=0 Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.759203 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae","Type":"ContainerDied","Data":"57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563"} Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.759229 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae","Type":"ContainerDied","Data":"fe5b7a4c77e70cdb61062f2b6d3965e1d41a849aa2695f9a2beb627a3c763657"} Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.759281 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803431 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data-custom\") pod \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803472 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data-custom\") pod \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803501 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-logs\") pod \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803560 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-combined-ca-bundle\") pod \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803609 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-scripts\") pod \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803648 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-public-tls-certs\") pod \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803687 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-certs\") pod \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803706 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data\") pod \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803725 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6c54\" (UniqueName: \"kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-kube-api-access-c6c54\") pod \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803766 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-scripts\") pod \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803797 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-certs\") pod \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803825 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data\") pod \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803891 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-combined-ca-bundle\") pod \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\" (UID: \"5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803942 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-internal-tls-certs\") pod \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.803995 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkzgn\" (UniqueName: \"kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-kube-api-access-vkzgn\") pod \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\" (UID: \"6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae\") " Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.804403 4830 scope.go:117] "RemoveContainer" containerID="bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f" Dec 03 22:31:41 crc kubenswrapper[4830]: E1203 22:31:41.806668 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f\": container with ID starting with bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f not found: ID does not exist" containerID="bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f" Dec 03 22:31:41 
crc kubenswrapper[4830]: I1203 22:31:41.806720 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f"} err="failed to get container status \"bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f\": rpc error: code = NotFound desc = could not find container \"bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f\": container with ID starting with bed162911524b8bd8e562458933f5d948a397c439a8e28dca16ffc5fb86dcc6f not found: ID does not exist" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.806746 4830 scope.go:117] "RemoveContainer" containerID="57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.807178 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-logs" (OuterVolumeSpecName: "logs") pod "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" (UID: "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.809214 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-scripts" (OuterVolumeSpecName: "scripts") pod "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" (UID: "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.813343 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" (UID: "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.813953 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-scripts" (OuterVolumeSpecName: "scripts") pod "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" (UID: "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.814062 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-kube-api-access-c6c54" (OuterVolumeSpecName: "kube-api-access-c6c54") pod "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" (UID: "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192"). InnerVolumeSpecName "kube-api-access-c6c54". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.814120 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-kube-api-access-vkzgn" (OuterVolumeSpecName: "kube-api-access-vkzgn") pod "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" (UID: "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae"). InnerVolumeSpecName "kube-api-access-vkzgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.814863 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" (UID: "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.820903 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-certs" (OuterVolumeSpecName: "certs") pod "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" (UID: "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.829734 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-certs" (OuterVolumeSpecName: "certs") pod "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" (UID: "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.845155 4830 scope.go:117] "RemoveContainer" containerID="2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.869788 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" (UID: "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.880325 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data" (OuterVolumeSpecName: "config-data") pod "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" (UID: "5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.880378 4830 scope.go:117] "RemoveContainer" containerID="57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.880487 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" (UID: "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: E1203 22:31:41.881058 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563\": container with ID starting with 57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563 not found: ID does not exist" containerID="57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.881099 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563"} err="failed to get container status \"57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563\": rpc error: code = NotFound desc = could not find container \"57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563\": container with ID starting with 57c13d662cd3d6cf16d61c6090bbf1ae3e710a5d42444cb6ce11c95305e52563 not found: ID does not exist" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.881127 4830 scope.go:117] "RemoveContainer" containerID="2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129" Dec 03 22:31:41 crc kubenswrapper[4830]: E1203 22:31:41.884612 4830 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129\": container with ID starting with 2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129 not found: ID does not exist" containerID="2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.884646 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129"} err="failed to get container status \"2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129\": rpc error: code = NotFound desc = could not find container \"2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129\": container with ID starting with 2464815b40f41a601ed6eefa2af8cc5a58ef3c454d61824b6331bf4ef90ef129 not found: ID does not exist" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.905493 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data" (OuterVolumeSpecName: "config-data") pod "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" (UID: "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.906280 4830 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.906376 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.906458 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6c54\" (UniqueName: \"kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-kube-api-access-c6c54\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.906544 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.906766 4830 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.906913 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.907018 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.907095 4830 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-vkzgn\" (UniqueName: \"kubernetes.io/projected/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-kube-api-access-vkzgn\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.907171 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.907696 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.907777 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-logs\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.907870 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.907928 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.920698 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" (UID: "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:41 crc kubenswrapper[4830]: I1203 22:31:41.921707 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" (UID: "6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.010269 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.010309 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.094318 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.118675 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.133995 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.143368 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.156653 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:31:42 crc kubenswrapper[4830]: E1203 22:31:42.157119 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" 
containerName="cloudkitty-api" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.157136 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" containerName="cloudkitty-api" Dec 03 22:31:42 crc kubenswrapper[4830]: E1203 22:31:42.157156 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" containerName="cloudkitty-api-log" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.157163 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" containerName="cloudkitty-api-log" Dec 03 22:31:42 crc kubenswrapper[4830]: E1203 22:31:42.157198 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" containerName="cloudkitty-proc" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.157205 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" containerName="cloudkitty-proc" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.157394 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" containerName="cloudkitty-api-log" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.157412 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" containerName="cloudkitty-api" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.157426 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" containerName="cloudkitty-proc" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.158166 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.163676 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.163726 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.163994 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-pzjnl" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.164693 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.164717 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.167425 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.179256 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.180977 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.184432 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.184639 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.185181 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.194166 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.315855 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-scripts\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.315914 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdfn6\" (UniqueName: \"kubernetes.io/projected/74cd90ac-c295-404a-afa5-d2977c397561-kube-api-access-qdfn6\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.315948 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.315984 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.316022 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.316043 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-config-data\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.316092 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/74cd90ac-c295-404a-afa5-d2977c397561-certs\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.316132 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74cd90ac-c295-404a-afa5-d2977c397561-logs\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.316168 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-scripts\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.316215 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.316244 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.316270 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kshm8\" (UniqueName: \"kubernetes.io/projected/6be5d9e4-3136-4c97-89ef-9376c1ef588c-kube-api-access-kshm8\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.316311 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6be5d9e4-3136-4c97-89ef-9376c1ef588c-certs\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.316329 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.316345 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-config-data\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418100 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418175 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kshm8\" (UniqueName: \"kubernetes.io/projected/6be5d9e4-3136-4c97-89ef-9376c1ef588c-kube-api-access-kshm8\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418223 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6be5d9e4-3136-4c97-89ef-9376c1ef588c-certs\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418251 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: 
\"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418270 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-config-data\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418309 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-scripts\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418325 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdfn6\" (UniqueName: \"kubernetes.io/projected/74cd90ac-c295-404a-afa5-d2977c397561-kube-api-access-qdfn6\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418345 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418369 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418401 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418428 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-config-data\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418448 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/74cd90ac-c295-404a-afa5-d2977c397561-certs\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418478 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74cd90ac-c295-404a-afa5-d2977c397561-logs\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418522 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-scripts\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.418566 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: 
\"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.420755 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74cd90ac-c295-404a-afa5-d2977c397561-logs\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.424575 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.425276 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.425997 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.427443 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-scripts\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.428029 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.428254 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/74cd90ac-c295-404a-afa5-d2977c397561-certs\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.429103 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-config-data\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.435380 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6be5d9e4-3136-4c97-89ef-9376c1ef588c-certs\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.436267 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-config-data\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.440480 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-scripts\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 
22:31:42.440704 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74cd90ac-c295-404a-afa5-d2977c397561-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.441410 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6be5d9e4-3136-4c97-89ef-9376c1ef588c-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.441750 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdfn6\" (UniqueName: \"kubernetes.io/projected/74cd90ac-c295-404a-afa5-d2977c397561-kube-api-access-qdfn6\") pod \"cloudkitty-api-0\" (UID: \"74cd90ac-c295-404a-afa5-d2977c397561\") " pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.442221 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kshm8\" (UniqueName: \"kubernetes.io/projected/6be5d9e4-3136-4c97-89ef-9376c1ef588c-kube-api-access-kshm8\") pod \"cloudkitty-proc-0\" (UID: \"6be5d9e4-3136-4c97-89ef-9376c1ef588c\") " pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.478591 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.568171 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Dec 03 22:31:42 crc kubenswrapper[4830]: I1203 22:31:42.806328 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt49n" event={"ID":"fcbd4ff2-fae6-4680-8017-8d64403b9f03","Type":"ContainerStarted","Data":"ff77bb94ef0466777a9340455548b79dcc105a967397771323195617fa112a23"} Dec 03 22:31:43 crc kubenswrapper[4830]: I1203 22:31:43.032704 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Dec 03 22:31:43 crc kubenswrapper[4830]: W1203 22:31:43.034894 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6be5d9e4_3136_4c97_89ef_9376c1ef588c.slice/crio-2c396510db82f844559af2930f856383fe88d4704ac23c0438959e94751fe45f WatchSource:0}: Error finding container 2c396510db82f844559af2930f856383fe88d4704ac23c0438959e94751fe45f: Status 404 returned error can't find the container with id 2c396510db82f844559af2930f856383fe88d4704ac23c0438959e94751fe45f Dec 03 22:31:43 crc kubenswrapper[4830]: I1203 22:31:43.191386 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Dec 03 22:31:43 crc kubenswrapper[4830]: W1203 22:31:43.195124 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74cd90ac_c295_404a_afa5_d2977c397561.slice/crio-8d57c41bbd7eba111f263b23368db4b2fe902e41137e26a5eba92ccd540fa80e WatchSource:0}: Error finding container 8d57c41bbd7eba111f263b23368db4b2fe902e41137e26a5eba92ccd540fa80e: Status 404 returned error can't find the container with id 8d57c41bbd7eba111f263b23368db4b2fe902e41137e26a5eba92ccd540fa80e Dec 03 22:31:43 crc kubenswrapper[4830]: I1203 22:31:43.354250 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192" 
path="/var/lib/kubelet/pods/5e3e33b6-b4bb-44c0-9ac1-f6c75b2c9192/volumes" Dec 03 22:31:43 crc kubenswrapper[4830]: I1203 22:31:43.355408 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae" path="/var/lib/kubelet/pods/6fe6875f-e952-4bbf-bf7f-17c25ab2a4ae/volumes" Dec 03 22:31:43 crc kubenswrapper[4830]: I1203 22:31:43.822317 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"74cd90ac-c295-404a-afa5-d2977c397561","Type":"ContainerStarted","Data":"992c8d6245a1f97016973fbf3b0e1e46181e6dbfea7e6bce5f0bbb813fd507d4"} Dec 03 22:31:43 crc kubenswrapper[4830]: I1203 22:31:43.822867 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Dec 03 22:31:43 crc kubenswrapper[4830]: I1203 22:31:43.822923 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"74cd90ac-c295-404a-afa5-d2977c397561","Type":"ContainerStarted","Data":"da9bad84536bdf784b9543af218a38b289f6ba07b5b4e49366d5b52f606d0b0e"} Dec 03 22:31:43 crc kubenswrapper[4830]: I1203 22:31:43.822945 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"74cd90ac-c295-404a-afa5-d2977c397561","Type":"ContainerStarted","Data":"8d57c41bbd7eba111f263b23368db4b2fe902e41137e26a5eba92ccd540fa80e"} Dec 03 22:31:43 crc kubenswrapper[4830]: I1203 22:31:43.823918 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"6be5d9e4-3136-4c97-89ef-9376c1ef588c","Type":"ContainerStarted","Data":"2c396510db82f844559af2930f856383fe88d4704ac23c0438959e94751fe45f"} Dec 03 22:31:43 crc kubenswrapper[4830]: I1203 22:31:43.826411 4830 generic.go:334] "Generic (PLEG): container finished" podID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" containerID="ff77bb94ef0466777a9340455548b79dcc105a967397771323195617fa112a23" exitCode=0 Dec 03 22:31:43 crc kubenswrapper[4830]: 
I1203 22:31:43.826448 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt49n" event={"ID":"fcbd4ff2-fae6-4680-8017-8d64403b9f03","Type":"ContainerDied","Data":"ff77bb94ef0466777a9340455548b79dcc105a967397771323195617fa112a23"} Dec 03 22:31:43 crc kubenswrapper[4830]: I1203 22:31:43.843625 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=1.8436035689999999 podStartE2EDuration="1.843603569s" podCreationTimestamp="2025-12-03 22:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:31:43.840096374 +0000 UTC m=+1592.836557723" watchObservedRunningTime="2025-12-03 22:31:43.843603569 +0000 UTC m=+1592.840064918" Dec 03 22:31:44 crc kubenswrapper[4830]: I1203 22:31:44.840099 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"6be5d9e4-3136-4c97-89ef-9376c1ef588c","Type":"ContainerStarted","Data":"f84733be860c9a36adfcbc4f297eb9d1aa33a5629ec02e7efa44b1e4452bdbdb"} Dec 03 22:31:44 crc kubenswrapper[4830]: I1203 22:31:44.847123 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt49n" event={"ID":"fcbd4ff2-fae6-4680-8017-8d64403b9f03","Type":"ContainerStarted","Data":"c5ba93bff7c646c64716e43ca6d808e81e2abd1424916e5a121bf20719db4d8c"} Dec 03 22:31:44 crc kubenswrapper[4830]: I1203 22:31:44.872200 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=1.920629723 podStartE2EDuration="2.872181588s" podCreationTimestamp="2025-12-03 22:31:42 +0000 UTC" firstStartedPulling="2025-12-03 22:31:43.037049056 +0000 UTC m=+1592.033510405" lastFinishedPulling="2025-12-03 22:31:43.988600911 +0000 UTC m=+1592.985062270" observedRunningTime="2025-12-03 22:31:44.856615727 +0000 UTC 
m=+1593.853077076" watchObservedRunningTime="2025-12-03 22:31:44.872181588 +0000 UTC m=+1593.868642937" Dec 03 22:31:44 crc kubenswrapper[4830]: I1203 22:31:44.879803 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dt49n" podStartSLOduration=3.328731989 podStartE2EDuration="5.879762053s" podCreationTimestamp="2025-12-03 22:31:39 +0000 UTC" firstStartedPulling="2025-12-03 22:31:41.757175971 +0000 UTC m=+1590.753637320" lastFinishedPulling="2025-12-03 22:31:44.308206035 +0000 UTC m=+1593.304667384" observedRunningTime="2025-12-03 22:31:44.871667944 +0000 UTC m=+1593.868129293" watchObservedRunningTime="2025-12-03 22:31:44.879762053 +0000 UTC m=+1593.876223402" Dec 03 22:31:50 crc kubenswrapper[4830]: I1203 22:31:50.167459 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:50 crc kubenswrapper[4830]: I1203 22:31:50.168154 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:50 crc kubenswrapper[4830]: I1203 22:31:50.261763 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:50 crc kubenswrapper[4830]: I1203 22:31:50.912496 4830 generic.go:334] "Generic (PLEG): container finished" podID="6fb3b204-2b5a-4dcb-a278-d58ea0dce557" containerID="67ffa2d15cfcc47d2f4802ad7cc2bba4714d99713d0174eb0626d5d9bdfd99a0" exitCode=0 Dec 03 22:31:50 crc kubenswrapper[4830]: I1203 22:31:50.912548 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6fb3b204-2b5a-4dcb-a278-d58ea0dce557","Type":"ContainerDied","Data":"67ffa2d15cfcc47d2f4802ad7cc2bba4714d99713d0174eb0626d5d9bdfd99a0"} Dec 03 22:31:50 crc kubenswrapper[4830]: I1203 22:31:50.982195 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:51 crc kubenswrapper[4830]: I1203 22:31:51.046810 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt49n"] Dec 03 22:31:51 crc kubenswrapper[4830]: I1203 22:31:51.924985 4830 generic.go:334] "Generic (PLEG): container finished" podID="5a2c4f61-6b61-4907-8601-6eea8065d2f6" containerID="5fe4a7a95e30e02d816c461f07ba934fbdf013c73d5fcc12dbc6fef802acca24" exitCode=0 Dec 03 22:31:51 crc kubenswrapper[4830]: I1203 22:31:51.925077 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5a2c4f61-6b61-4907-8601-6eea8065d2f6","Type":"ContainerDied","Data":"5fe4a7a95e30e02d816c461f07ba934fbdf013c73d5fcc12dbc6fef802acca24"} Dec 03 22:31:51 crc kubenswrapper[4830]: I1203 22:31:51.928431 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6fb3b204-2b5a-4dcb-a278-d58ea0dce557","Type":"ContainerStarted","Data":"2eeaaa9cf62a7706e1649e5b04a14fec0b087524e875db3ec4d36d14815c667c"} Dec 03 22:31:51 crc kubenswrapper[4830]: I1203 22:31:51.928779 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:31:52 crc kubenswrapper[4830]: I1203 22:31:52.940852 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5a2c4f61-6b61-4907-8601-6eea8065d2f6","Type":"ContainerStarted","Data":"e2ef576c947ac441af95db611e618363c2e2399ba455763f09b6e0c04fef3cab"} Dec 03 22:31:52 crc kubenswrapper[4830]: I1203 22:31:52.942075 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dt49n" podUID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" containerName="registry-server" containerID="cri-o://c5ba93bff7c646c64716e43ca6d808e81e2abd1424916e5a121bf20719db4d8c" gracePeriod=2 Dec 03 22:31:52 crc kubenswrapper[4830]: I1203 
22:31:52.942273 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.003272 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=44.003249066 podStartE2EDuration="44.003249066s" podCreationTimestamp="2025-12-03 22:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:31:52.999574867 +0000 UTC m=+1601.996036226" watchObservedRunningTime="2025-12-03 22:31:53.003249066 +0000 UTC m=+1601.999710415" Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.013929 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.013911665 podStartE2EDuration="39.013911665s" podCreationTimestamp="2025-12-03 22:31:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:31:51.993149348 +0000 UTC m=+1600.989610697" watchObservedRunningTime="2025-12-03 22:31:53.013911665 +0000 UTC m=+1602.010373014" Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.859159 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86"] Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.861966 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.865846 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.866067 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.866081 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.866128 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.930548 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86"] Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.955915 4830 generic.go:334] "Generic (PLEG): container finished" podID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" containerID="c5ba93bff7c646c64716e43ca6d808e81e2abd1424916e5a121bf20719db4d8c" exitCode=0 Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.957172 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt49n" event={"ID":"fcbd4ff2-fae6-4680-8017-8d64403b9f03","Type":"ContainerDied","Data":"c5ba93bff7c646c64716e43ca6d808e81e2abd1424916e5a121bf20719db4d8c"} Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.969263 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xds6q\" (UniqueName: \"kubernetes.io/projected/c3a26e72-8b75-423c-a151-d576eb7a4128-kube-api-access-xds6q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.969661 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.969726 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:53 crc kubenswrapper[4830]: I1203 22:31:53.969766 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.071336 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xds6q\" (UniqueName: \"kubernetes.io/projected/c3a26e72-8b75-423c-a151-d576eb7a4128-kube-api-access-xds6q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.071447 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.071467 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.071488 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.078622 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.078670 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.088716 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.099361 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.100760 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xds6q\" (UniqueName: \"kubernetes.io/projected/c3a26e72-8b75-423c-a151-d576eb7a4128-kube-api-access-xds6q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.176292 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh527\" (UniqueName: 
\"kubernetes.io/projected/fcbd4ff2-fae6-4680-8017-8d64403b9f03-kube-api-access-gh527\") pod \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.176341 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-catalog-content\") pod \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.176489 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-utilities\") pod \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\" (UID: \"fcbd4ff2-fae6-4680-8017-8d64403b9f03\") " Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.180350 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-utilities" (OuterVolumeSpecName: "utilities") pod "fcbd4ff2-fae6-4680-8017-8d64403b9f03" (UID: "fcbd4ff2-fae6-4680-8017-8d64403b9f03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.180564 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcbd4ff2-fae6-4680-8017-8d64403b9f03-kube-api-access-gh527" (OuterVolumeSpecName: "kube-api-access-gh527") pod "fcbd4ff2-fae6-4680-8017-8d64403b9f03" (UID: "fcbd4ff2-fae6-4680-8017-8d64403b9f03"). InnerVolumeSpecName "kube-api-access-gh527". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.196423 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.201449 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcbd4ff2-fae6-4680-8017-8d64403b9f03" (UID: "fcbd4ff2-fae6-4680-8017-8d64403b9f03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.279479 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh527\" (UniqueName: \"kubernetes.io/projected/fcbd4ff2-fae6-4680-8017-8d64403b9f03-kube-api-access-gh527\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.279841 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.279851 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcbd4ff2-fae6-4680-8017-8d64403b9f03-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.802395 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86"] Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.967485 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" event={"ID":"c3a26e72-8b75-423c-a151-d576eb7a4128","Type":"ContainerStarted","Data":"ccefe705df1e5e7669f1757e04f507515d50bcfbb25d48b79bc4b11c99a27c8b"} Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.970054 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-dt49n" event={"ID":"fcbd4ff2-fae6-4680-8017-8d64403b9f03","Type":"ContainerDied","Data":"dc24c7df2aa8eab269cda10bcd327d994adc5bc4b05348920bdfd55e35c58580"} Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.970088 4830 scope.go:117] "RemoveContainer" containerID="c5ba93bff7c646c64716e43ca6d808e81e2abd1424916e5a121bf20719db4d8c" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.970147 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt49n" Dec 03 22:31:54 crc kubenswrapper[4830]: I1203 22:31:54.996194 4830 scope.go:117] "RemoveContainer" containerID="ff77bb94ef0466777a9340455548b79dcc105a967397771323195617fa112a23" Dec 03 22:31:55 crc kubenswrapper[4830]: I1203 22:31:55.027259 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt49n"] Dec 03 22:31:55 crc kubenswrapper[4830]: I1203 22:31:55.036563 4830 scope.go:117] "RemoveContainer" containerID="e93fbef31bbf4f44c443f2675bfa6400e7000052a76350095eef2d2986e26ad3" Dec 03 22:31:55 crc kubenswrapper[4830]: I1203 22:31:55.038563 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt49n"] Dec 03 22:31:55 crc kubenswrapper[4830]: I1203 22:31:55.354095 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" path="/var/lib/kubelet/pods/fcbd4ff2-fae6-4680-8017-8d64403b9f03/volumes" Dec 03 22:31:56 crc kubenswrapper[4830]: I1203 22:31:56.681325 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:31:56 crc kubenswrapper[4830]: I1203 22:31:56.681389 4830 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:32:04 crc kubenswrapper[4830]: I1203 22:32:04.930649 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:32:08 crc kubenswrapper[4830]: I1203 22:32:08.131411 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" event={"ID":"c3a26e72-8b75-423c-a151-d576eb7a4128","Type":"ContainerStarted","Data":"aba640a1ac3b59ce97622b3e210ef64239028d74f2ab0bccc13b72b0bba06377"} Dec 03 22:32:08 crc kubenswrapper[4830]: I1203 22:32:08.153566 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" podStartSLOduration=2.9809906 podStartE2EDuration="15.153545242s" podCreationTimestamp="2025-12-03 22:31:53 +0000 UTC" firstStartedPulling="2025-12-03 22:31:54.813244648 +0000 UTC m=+1603.809705997" lastFinishedPulling="2025-12-03 22:32:06.98579929 +0000 UTC m=+1615.982260639" observedRunningTime="2025-12-03 22:32:08.144870248 +0000 UTC m=+1617.141331597" watchObservedRunningTime="2025-12-03 22:32:08.153545242 +0000 UTC m=+1617.150006591" Dec 03 22:32:09 crc kubenswrapper[4830]: I1203 22:32:09.709744 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 22:32:19 crc kubenswrapper[4830]: I1203 22:32:19.453887 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Dec 03 22:32:20 crc kubenswrapper[4830]: I1203 22:32:20.848858 4830 scope.go:117] "RemoveContainer" containerID="53b55af998e0f4cdf29eda24fd09747ad3a661b5682c2892db5e024af9bbd87f" Dec 03 22:32:20 crc 
kubenswrapper[4830]: I1203 22:32:20.910010 4830 scope.go:117] "RemoveContainer" containerID="19b361512fa4c6f017b9fc0c350f7985f647adac000fa04cc66280d88048adef" Dec 03 22:32:20 crc kubenswrapper[4830]: I1203 22:32:20.945768 4830 scope.go:117] "RemoveContainer" containerID="27139c88accd5c1abc4ea51a562d002e0f7bae4858086f81eba8a195adb9c643" Dec 03 22:32:20 crc kubenswrapper[4830]: I1203 22:32:20.970500 4830 scope.go:117] "RemoveContainer" containerID="cdcda774bd14de42e9e955e8b645bc4c809aadc6cbf62bc94d0018527ec5a70e" Dec 03 22:32:26 crc kubenswrapper[4830]: I1203 22:32:26.680978 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:32:26 crc kubenswrapper[4830]: I1203 22:32:26.681589 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:32:34 crc kubenswrapper[4830]: I1203 22:32:34.431923 4830 generic.go:334] "Generic (PLEG): container finished" podID="c3a26e72-8b75-423c-a151-d576eb7a4128" containerID="aba640a1ac3b59ce97622b3e210ef64239028d74f2ab0bccc13b72b0bba06377" exitCode=0 Dec 03 22:32:34 crc kubenswrapper[4830]: I1203 22:32:34.432008 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" event={"ID":"c3a26e72-8b75-423c-a151-d576eb7a4128","Type":"ContainerDied","Data":"aba640a1ac3b59ce97622b3e210ef64239028d74f2ab0bccc13b72b0bba06377"} Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.019765 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.120680 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-ssh-key\") pod \"c3a26e72-8b75-423c-a151-d576eb7a4128\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.120971 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-inventory\") pod \"c3a26e72-8b75-423c-a151-d576eb7a4128\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.121007 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-repo-setup-combined-ca-bundle\") pod \"c3a26e72-8b75-423c-a151-d576eb7a4128\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.121200 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xds6q\" (UniqueName: \"kubernetes.io/projected/c3a26e72-8b75-423c-a151-d576eb7a4128-kube-api-access-xds6q\") pod \"c3a26e72-8b75-423c-a151-d576eb7a4128\" (UID: \"c3a26e72-8b75-423c-a151-d576eb7a4128\") " Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.131820 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a26e72-8b75-423c-a151-d576eb7a4128-kube-api-access-xds6q" (OuterVolumeSpecName: "kube-api-access-xds6q") pod "c3a26e72-8b75-423c-a151-d576eb7a4128" (UID: "c3a26e72-8b75-423c-a151-d576eb7a4128"). InnerVolumeSpecName "kube-api-access-xds6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.134081 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c3a26e72-8b75-423c-a151-d576eb7a4128" (UID: "c3a26e72-8b75-423c-a151-d576eb7a4128"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.154105 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c3a26e72-8b75-423c-a151-d576eb7a4128" (UID: "c3a26e72-8b75-423c-a151-d576eb7a4128"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.156320 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-inventory" (OuterVolumeSpecName: "inventory") pod "c3a26e72-8b75-423c-a151-d576eb7a4128" (UID: "c3a26e72-8b75-423c-a151-d576eb7a4128"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.225166 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xds6q\" (UniqueName: \"kubernetes.io/projected/c3a26e72-8b75-423c-a151-d576eb7a4128-kube-api-access-xds6q\") on node \"crc\" DevicePath \"\"" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.225218 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.225239 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.225259 4830 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3a26e72-8b75-423c-a151-d576eb7a4128-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.457115 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" event={"ID":"c3a26e72-8b75-423c-a151-d576eb7a4128","Type":"ContainerDied","Data":"ccefe705df1e5e7669f1757e04f507515d50bcfbb25d48b79bc4b11c99a27c8b"} Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.457200 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccefe705df1e5e7669f1757e04f507515d50bcfbb25d48b79bc4b11c99a27c8b" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.457143 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.533485 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb"] Dec 03 22:32:36 crc kubenswrapper[4830]: E1203 22:32:36.534305 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" containerName="extract-content" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.534321 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" containerName="extract-content" Dec 03 22:32:36 crc kubenswrapper[4830]: E1203 22:32:36.534341 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" containerName="registry-server" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.534349 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" containerName="registry-server" Dec 03 22:32:36 crc kubenswrapper[4830]: E1203 22:32:36.534363 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" containerName="extract-utilities" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.534371 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" containerName="extract-utilities" Dec 03 22:32:36 crc kubenswrapper[4830]: E1203 22:32:36.534403 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a26e72-8b75-423c-a151-d576eb7a4128" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.534409 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a26e72-8b75-423c-a151-d576eb7a4128" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.534632 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fcbd4ff2-fae6-4680-8017-8d64403b9f03" containerName="registry-server" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.534652 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3a26e72-8b75-423c-a151-d576eb7a4128" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.535379 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.538446 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.538446 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.539701 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.541702 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.559902 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb"] Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.633751 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7kcb\" (UniqueName: \"kubernetes.io/projected/18583450-269e-412a-99f5-203326569e83-kube-api-access-k7kcb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6slmb\" (UID: \"18583450-269e-412a-99f5-203326569e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 
22:32:36.633807 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6slmb\" (UID: \"18583450-269e-412a-99f5-203326569e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.634297 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6slmb\" (UID: \"18583450-269e-412a-99f5-203326569e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.737192 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6slmb\" (UID: \"18583450-269e-412a-99f5-203326569e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.737480 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7kcb\" (UniqueName: \"kubernetes.io/projected/18583450-269e-412a-99f5-203326569e83-kube-api-access-k7kcb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6slmb\" (UID: \"18583450-269e-412a-99f5-203326569e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.737556 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6slmb\" (UID: 
\"18583450-269e-412a-99f5-203326569e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.742326 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6slmb\" (UID: \"18583450-269e-412a-99f5-203326569e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.746264 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6slmb\" (UID: \"18583450-269e-412a-99f5-203326569e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.755035 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7kcb\" (UniqueName: \"kubernetes.io/projected/18583450-269e-412a-99f5-203326569e83-kube-api-access-k7kcb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6slmb\" (UID: \"18583450-269e-412a-99f5-203326569e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:36 crc kubenswrapper[4830]: I1203 22:32:36.870382 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:37 crc kubenswrapper[4830]: I1203 22:32:37.527705 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb"] Dec 03 22:32:37 crc kubenswrapper[4830]: I1203 22:32:37.530425 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:32:38 crc kubenswrapper[4830]: I1203 22:32:38.483483 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" event={"ID":"18583450-269e-412a-99f5-203326569e83","Type":"ContainerStarted","Data":"4d278728b6ef63a7cd7809206755a0ee3ffe9db777253e196b93ea671722a1df"} Dec 03 22:32:39 crc kubenswrapper[4830]: I1203 22:32:39.496653 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" event={"ID":"18583450-269e-412a-99f5-203326569e83","Type":"ContainerStarted","Data":"76fa7b56dc5312294e417820b0d8d25268e657489e8a33c660400affa25719ac"} Dec 03 22:32:40 crc kubenswrapper[4830]: I1203 22:32:40.536481 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" podStartSLOduration=3.50991168 podStartE2EDuration="4.536462523s" podCreationTimestamp="2025-12-03 22:32:36 +0000 UTC" firstStartedPulling="2025-12-03 22:32:37.530209008 +0000 UTC m=+1646.526670347" lastFinishedPulling="2025-12-03 22:32:38.556759841 +0000 UTC m=+1647.553221190" observedRunningTime="2025-12-03 22:32:40.527152422 +0000 UTC m=+1649.523613781" watchObservedRunningTime="2025-12-03 22:32:40.536462523 +0000 UTC m=+1649.532923872" Dec 03 22:32:42 crc kubenswrapper[4830]: I1203 22:32:42.543204 4830 generic.go:334] "Generic (PLEG): container finished" podID="18583450-269e-412a-99f5-203326569e83" containerID="76fa7b56dc5312294e417820b0d8d25268e657489e8a33c660400affa25719ac" 
exitCode=0 Dec 03 22:32:42 crc kubenswrapper[4830]: I1203 22:32:42.543417 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" event={"ID":"18583450-269e-412a-99f5-203326569e83","Type":"ContainerDied","Data":"76fa7b56dc5312294e417820b0d8d25268e657489e8a33c660400affa25719ac"} Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.072906 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.209361 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-ssh-key\") pod \"18583450-269e-412a-99f5-203326569e83\" (UID: \"18583450-269e-412a-99f5-203326569e83\") " Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.209549 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-inventory\") pod \"18583450-269e-412a-99f5-203326569e83\" (UID: \"18583450-269e-412a-99f5-203326569e83\") " Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.209719 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7kcb\" (UniqueName: \"kubernetes.io/projected/18583450-269e-412a-99f5-203326569e83-kube-api-access-k7kcb\") pod \"18583450-269e-412a-99f5-203326569e83\" (UID: \"18583450-269e-412a-99f5-203326569e83\") " Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.217096 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18583450-269e-412a-99f5-203326569e83-kube-api-access-k7kcb" (OuterVolumeSpecName: "kube-api-access-k7kcb") pod "18583450-269e-412a-99f5-203326569e83" (UID: "18583450-269e-412a-99f5-203326569e83"). 
InnerVolumeSpecName "kube-api-access-k7kcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.273550 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-inventory" (OuterVolumeSpecName: "inventory") pod "18583450-269e-412a-99f5-203326569e83" (UID: "18583450-269e-412a-99f5-203326569e83"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.273802 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "18583450-269e-412a-99f5-203326569e83" (UID: "18583450-269e-412a-99f5-203326569e83"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.312044 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.312082 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18583450-269e-412a-99f5-203326569e83-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.312097 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7kcb\" (UniqueName: \"kubernetes.io/projected/18583450-269e-412a-99f5-203326569e83-kube-api-access-k7kcb\") on node \"crc\" DevicePath \"\"" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.572812 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" 
event={"ID":"18583450-269e-412a-99f5-203326569e83","Type":"ContainerDied","Data":"4d278728b6ef63a7cd7809206755a0ee3ffe9db777253e196b93ea671722a1df"} Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.572874 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d278728b6ef63a7cd7809206755a0ee3ffe9db777253e196b93ea671722a1df" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.572986 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6slmb" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.747642 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7"] Dec 03 22:32:44 crc kubenswrapper[4830]: E1203 22:32:44.748184 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18583450-269e-412a-99f5-203326569e83" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.748205 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="18583450-269e-412a-99f5-203326569e83" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.748436 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="18583450-269e-412a-99f5-203326569e83" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.749200 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.752799 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.754728 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4pjn\" (UniqueName: \"kubernetes.io/projected/be2984ba-f7dc-4271-ad04-f59c4ad3729d-kube-api-access-t4pjn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.754864 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.755077 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.755243 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: 
\"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.753296 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.753457 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.756412 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.775566 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7"] Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.857819 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.858203 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.858274 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4pjn\" (UniqueName: \"kubernetes.io/projected/be2984ba-f7dc-4271-ad04-f59c4ad3729d-kube-api-access-t4pjn\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.858296 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.863084 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.867970 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.868556 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:44 crc kubenswrapper[4830]: I1203 22:32:44.874601 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4pjn\" (UniqueName: \"kubernetes.io/projected/be2984ba-f7dc-4271-ad04-f59c4ad3729d-kube-api-access-t4pjn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:45 crc kubenswrapper[4830]: I1203 22:32:45.091623 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:32:45 crc kubenswrapper[4830]: I1203 22:32:45.665479 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7"] Dec 03 22:32:46 crc kubenswrapper[4830]: I1203 22:32:46.597114 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" event={"ID":"be2984ba-f7dc-4271-ad04-f59c4ad3729d","Type":"ContainerStarted","Data":"d2bb2da737a83c1a1297805bc35b1c374dba87aeca5a1288c80ae9c9be25ae5b"} Dec 03 22:32:46 crc kubenswrapper[4830]: I1203 22:32:46.597527 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" event={"ID":"be2984ba-f7dc-4271-ad04-f59c4ad3729d","Type":"ContainerStarted","Data":"b918938a202156327ce8eab71f21d15ed07ec28674e7c4fb249ac8a2309f8a73"} Dec 03 22:32:46 crc kubenswrapper[4830]: I1203 22:32:46.622667 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" podStartSLOduration=2.004700604 podStartE2EDuration="2.622646326s" podCreationTimestamp="2025-12-03 22:32:44 +0000 UTC" firstStartedPulling="2025-12-03 22:32:45.662641593 +0000 UTC m=+1654.659102942" lastFinishedPulling="2025-12-03 22:32:46.280587305 +0000 UTC m=+1655.277048664" observedRunningTime="2025-12-03 22:32:46.616039328 +0000 UTC 
m=+1655.612500757" watchObservedRunningTime="2025-12-03 22:32:46.622646326 +0000 UTC m=+1655.619107675" Dec 03 22:32:56 crc kubenswrapper[4830]: I1203 22:32:56.681577 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:32:56 crc kubenswrapper[4830]: I1203 22:32:56.682122 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:32:56 crc kubenswrapper[4830]: I1203 22:32:56.682166 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:32:56 crc kubenswrapper[4830]: I1203 22:32:56.682982 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 22:32:56 crc kubenswrapper[4830]: I1203 22:32:56.683058 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" gracePeriod=600 Dec 03 22:32:58 crc kubenswrapper[4830]: I1203 22:32:58.762585 4830 generic.go:334] "Generic (PLEG): container 
finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" exitCode=0 Dec 03 22:32:58 crc kubenswrapper[4830]: I1203 22:32:58.762726 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645"} Dec 03 22:32:58 crc kubenswrapper[4830]: I1203 22:32:58.763270 4830 scope.go:117] "RemoveContainer" containerID="d26c80f781595ab631a18d1aaee4115dc9be59de9280be665b2937d58ec45743" Dec 03 22:32:58 crc kubenswrapper[4830]: E1203 22:32:58.995338 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:32:59 crc kubenswrapper[4830]: I1203 22:32:59.778576 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:32:59 crc kubenswrapper[4830]: E1203 22:32:59.778973 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:33:11 crc kubenswrapper[4830]: I1203 22:33:11.353921 4830 scope.go:117] "RemoveContainer" 
containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:33:11 crc kubenswrapper[4830]: E1203 22:33:11.356671 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:33:21 crc kubenswrapper[4830]: I1203 22:33:21.151006 4830 scope.go:117] "RemoveContainer" containerID="d54586ae7f972138b491d4d8a5a3b90851ec3a1596fd487fa8203019a4d7ea6c" Dec 03 22:33:22 crc kubenswrapper[4830]: I1203 22:33:22.336616 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:33:22 crc kubenswrapper[4830]: E1203 22:33:22.337174 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:33:34 crc kubenswrapper[4830]: I1203 22:33:34.337444 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:33:34 crc kubenswrapper[4830]: E1203 22:33:34.338624 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:33:49 crc kubenswrapper[4830]: I1203 22:33:49.337151 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:33:49 crc kubenswrapper[4830]: E1203 22:33:49.338336 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:34:02 crc kubenswrapper[4830]: I1203 22:34:02.337439 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:34:02 crc kubenswrapper[4830]: E1203 22:34:02.338801 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:34:15 crc kubenswrapper[4830]: I1203 22:34:15.336766 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:34:15 crc kubenswrapper[4830]: E1203 22:34:15.337546 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:34:21 crc kubenswrapper[4830]: I1203 22:34:21.235873 4830 scope.go:117] "RemoveContainer" containerID="3cc8cdda701438d35564c3e5d745b988294c619ef5feaa267b94019bbc58bc23" Dec 03 22:34:21 crc kubenswrapper[4830]: I1203 22:34:21.289936 4830 scope.go:117] "RemoveContainer" containerID="03ce774e713a1c2e075856a9a4aefa62679a21b997d09d8326bc6d1f209d0788" Dec 03 22:34:21 crc kubenswrapper[4830]: I1203 22:34:21.326901 4830 scope.go:117] "RemoveContainer" containerID="d9d5f16469b8123bc6499d5077a96e2a9678fe79a93885eead62c47e897619f7" Dec 03 22:34:21 crc kubenswrapper[4830]: I1203 22:34:21.356612 4830 scope.go:117] "RemoveContainer" containerID="26d8332b3b8aac79ff6d5d59ed3b12be2e1f2ed223f98ed8131694b13868e279" Dec 03 22:34:21 crc kubenswrapper[4830]: I1203 22:34:21.405901 4830 scope.go:117] "RemoveContainer" containerID="3363ceeef6da78e752bca7169f942bd2501da74c38fddf7bb4019fc7cb02177c" Dec 03 22:34:21 crc kubenswrapper[4830]: I1203 22:34:21.442081 4830 scope.go:117] "RemoveContainer" containerID="7ced4866fca7684317b1866965340d5324e664a98ead6897195ebd44fa8fce5c" Dec 03 22:34:21 crc kubenswrapper[4830]: I1203 22:34:21.481536 4830 scope.go:117] "RemoveContainer" containerID="8eb8d04d323be9531430e60db5b7a09c9649781929f56a5c5029fcea6df97bea" Dec 03 22:34:26 crc kubenswrapper[4830]: I1203 22:34:26.336638 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:34:26 crc kubenswrapper[4830]: E1203 22:34:26.337493 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:34:39 crc kubenswrapper[4830]: I1203 22:34:39.337359 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:34:39 crc kubenswrapper[4830]: E1203 22:34:39.338102 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:34:53 crc kubenswrapper[4830]: I1203 22:34:53.338134 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:34:53 crc kubenswrapper[4830]: E1203 22:34:53.338923 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:35:04 crc kubenswrapper[4830]: I1203 22:35:04.338030 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:35:04 crc kubenswrapper[4830]: E1203 22:35:04.338994 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:35:16 crc kubenswrapper[4830]: I1203 22:35:16.337362 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:35:16 crc kubenswrapper[4830]: E1203 22:35:16.338027 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:35:21 crc kubenswrapper[4830]: I1203 22:35:21.592272 4830 scope.go:117] "RemoveContainer" containerID="acede5cebe2ec492a7f5581f438c944410198aec1103c227865f8442b48df2eb" Dec 03 22:35:21 crc kubenswrapper[4830]: I1203 22:35:21.613734 4830 scope.go:117] "RemoveContainer" containerID="cc5bf7687dfb8cc365f2b312194cd250fa43e7f499bce34a1b981d42d8031039" Dec 03 22:35:21 crc kubenswrapper[4830]: I1203 22:35:21.707326 4830 scope.go:117] "RemoveContainer" containerID="8142bbc1a9e49358537f6fbeec417c4c11dd37da86bf46815b7c43bdcdcc51f0" Dec 03 22:35:29 crc kubenswrapper[4830]: I1203 22:35:29.337649 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:35:29 crc kubenswrapper[4830]: E1203 22:35:29.338418 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:35:42 crc kubenswrapper[4830]: I1203 22:35:42.337253 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:35:42 crc kubenswrapper[4830]: E1203 22:35:42.338311 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:35:44 crc kubenswrapper[4830]: I1203 22:35:44.049834 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-skf7v"] Dec 03 22:35:44 crc kubenswrapper[4830]: I1203 22:35:44.059066 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-skf7v"] Dec 03 22:35:45 crc kubenswrapper[4830]: I1203 22:35:45.036582 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8e4a-account-create-update-65bgn"] Dec 03 22:35:45 crc kubenswrapper[4830]: I1203 22:35:45.045068 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8e4a-account-create-update-65bgn"] Dec 03 22:35:45 crc kubenswrapper[4830]: I1203 22:35:45.352782 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74447e05-2f6d-441a-9a5a-b275cb318a91" path="/var/lib/kubelet/pods/74447e05-2f6d-441a-9a5a-b275cb318a91/volumes" Dec 03 22:35:45 crc kubenswrapper[4830]: I1203 22:35:45.354386 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caf109e1-3c06-46c6-a0d2-5cd73bdbc98c" 
path="/var/lib/kubelet/pods/caf109e1-3c06-46c6-a0d2-5cd73bdbc98c/volumes" Dec 03 22:35:48 crc kubenswrapper[4830]: I1203 22:35:48.033837 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-9hhqz"] Dec 03 22:35:48 crc kubenswrapper[4830]: I1203 22:35:48.045869 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8d55-account-create-update-nqj5k"] Dec 03 22:35:48 crc kubenswrapper[4830]: I1203 22:35:48.076666 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-89cc-account-create-update-vgvp8"] Dec 03 22:35:48 crc kubenswrapper[4830]: I1203 22:35:48.090181 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-9hhqz"] Dec 03 22:35:48 crc kubenswrapper[4830]: I1203 22:35:48.102231 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-gx7cl"] Dec 03 22:35:48 crc kubenswrapper[4830]: I1203 22:35:48.112805 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-89cc-account-create-update-vgvp8"] Dec 03 22:35:48 crc kubenswrapper[4830]: I1203 22:35:48.121684 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8d55-account-create-update-nqj5k"] Dec 03 22:35:48 crc kubenswrapper[4830]: I1203 22:35:48.130228 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-gx7cl"] Dec 03 22:35:49 crc kubenswrapper[4830]: I1203 22:35:49.350178 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561f4d1a-2ac0-4446-a60d-922905025583" path="/var/lib/kubelet/pods/561f4d1a-2ac0-4446-a60d-922905025583/volumes" Dec 03 22:35:49 crc kubenswrapper[4830]: I1203 22:35:49.351459 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="766590ba-c5f6-426a-8562-bd3440bdbaa0" path="/var/lib/kubelet/pods/766590ba-c5f6-426a-8562-bd3440bdbaa0/volumes" Dec 03 22:35:49 crc kubenswrapper[4830]: I1203 22:35:49.352243 4830 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0df67c3-b98f-4294-9aa8-73ac7efb6b99" path="/var/lib/kubelet/pods/a0df67c3-b98f-4294-9aa8-73ac7efb6b99/volumes" Dec 03 22:35:49 crc kubenswrapper[4830]: I1203 22:35:49.352989 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1431d7e-ecf6-4b69-891b-6522466fafb9" path="/var/lib/kubelet/pods/b1431d7e-ecf6-4b69-891b-6522466fafb9/volumes" Dec 03 22:35:54 crc kubenswrapper[4830]: I1203 22:35:54.338062 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:35:54 crc kubenswrapper[4830]: E1203 22:35:54.339412 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:36:08 crc kubenswrapper[4830]: I1203 22:36:08.338066 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:36:08 crc kubenswrapper[4830]: E1203 22:36:08.339441 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:36:14 crc kubenswrapper[4830]: I1203 22:36:14.144792 4830 generic.go:334] "Generic (PLEG): container finished" podID="be2984ba-f7dc-4271-ad04-f59c4ad3729d" 
containerID="d2bb2da737a83c1a1297805bc35b1c374dba87aeca5a1288c80ae9c9be25ae5b" exitCode=0 Dec 03 22:36:14 crc kubenswrapper[4830]: I1203 22:36:14.144890 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" event={"ID":"be2984ba-f7dc-4271-ad04-f59c4ad3729d","Type":"ContainerDied","Data":"d2bb2da737a83c1a1297805bc35b1c374dba87aeca5a1288c80ae9c9be25ae5b"} Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.735752 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.877565 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-inventory\") pod \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.877668 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4pjn\" (UniqueName: \"kubernetes.io/projected/be2984ba-f7dc-4271-ad04-f59c4ad3729d-kube-api-access-t4pjn\") pod \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.877831 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-bootstrap-combined-ca-bundle\") pod \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.877975 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-ssh-key\") pod 
\"be2984ba-f7dc-4271-ad04-f59c4ad3729d\" (UID: \"be2984ba-f7dc-4271-ad04-f59c4ad3729d\") " Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.883390 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be2984ba-f7dc-4271-ad04-f59c4ad3729d-kube-api-access-t4pjn" (OuterVolumeSpecName: "kube-api-access-t4pjn") pod "be2984ba-f7dc-4271-ad04-f59c4ad3729d" (UID: "be2984ba-f7dc-4271-ad04-f59c4ad3729d"). InnerVolumeSpecName "kube-api-access-t4pjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.884190 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "be2984ba-f7dc-4271-ad04-f59c4ad3729d" (UID: "be2984ba-f7dc-4271-ad04-f59c4ad3729d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.917679 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-inventory" (OuterVolumeSpecName: "inventory") pod "be2984ba-f7dc-4271-ad04-f59c4ad3729d" (UID: "be2984ba-f7dc-4271-ad04-f59c4ad3729d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.917693 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "be2984ba-f7dc-4271-ad04-f59c4ad3729d" (UID: "be2984ba-f7dc-4271-ad04-f59c4ad3729d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.980495 4830 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.980561 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.980574 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be2984ba-f7dc-4271-ad04-f59c4ad3729d-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:36:15 crc kubenswrapper[4830]: I1203 22:36:15.980585 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4pjn\" (UniqueName: \"kubernetes.io/projected/be2984ba-f7dc-4271-ad04-f59c4ad3729d-kube-api-access-t4pjn\") on node \"crc\" DevicePath \"\"" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.165869 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" event={"ID":"be2984ba-f7dc-4271-ad04-f59c4ad3729d","Type":"ContainerDied","Data":"b918938a202156327ce8eab71f21d15ed07ec28674e7c4fb249ac8a2309f8a73"} Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.165919 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b918938a202156327ce8eab71f21d15ed07ec28674e7c4fb249ac8a2309f8a73" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.165939 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.267401 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj"] Dec 03 22:36:16 crc kubenswrapper[4830]: E1203 22:36:16.268006 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be2984ba-f7dc-4271-ad04-f59c4ad3729d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.268030 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="be2984ba-f7dc-4271-ad04-f59c4ad3729d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.268311 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="be2984ba-f7dc-4271-ad04-f59c4ad3729d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.269241 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.271497 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.271889 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.272800 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.273141 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.277319 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj"] Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.389655 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz5gq\" (UniqueName: \"kubernetes.io/projected/1f140812-ce94-43b3-bdec-466d8a3d2417-kube-api-access-wz5gq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.389875 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.390337 
4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.494846 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.495264 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz5gq\" (UniqueName: \"kubernetes.io/projected/1f140812-ce94-43b3-bdec-466d8a3d2417-kube-api-access-wz5gq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.495340 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.500775 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.508122 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.511848 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz5gq\" (UniqueName: \"kubernetes.io/projected/1f140812-ce94-43b3-bdec-466d8a3d2417-kube-api-access-wz5gq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:36:16 crc kubenswrapper[4830]: I1203 22:36:16.607611 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:36:17 crc kubenswrapper[4830]: I1203 22:36:17.041111 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-w9fdb"] Dec 03 22:36:17 crc kubenswrapper[4830]: I1203 22:36:17.054805 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-w9fdb"] Dec 03 22:36:17 crc kubenswrapper[4830]: I1203 22:36:17.128176 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj"] Dec 03 22:36:17 crc kubenswrapper[4830]: I1203 22:36:17.175327 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" event={"ID":"1f140812-ce94-43b3-bdec-466d8a3d2417","Type":"ContainerStarted","Data":"62f5dff72328067489c99ee6e2ee8b3cfa8106f9d43d39e9f86f0a842ebb5596"} Dec 03 22:36:18 crc kubenswrapper[4830]: I1203 22:36:18.585846 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ba568a6-43aa-4368-a33f-50b182c1faf8" path="/var/lib/kubelet/pods/5ba568a6-43aa-4368-a33f-50b182c1faf8/volumes" Dec 03 22:36:18 crc kubenswrapper[4830]: E1203 22:36:18.587622 4830 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.251s" Dec 03 22:36:19 crc kubenswrapper[4830]: I1203 22:36:19.201452 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" event={"ID":"1f140812-ce94-43b3-bdec-466d8a3d2417","Type":"ContainerStarted","Data":"47e14a62fc69d8bad0769732d296ab705b03c0fde7741d1916554edf797d7d93"} Dec 03 22:36:19 crc kubenswrapper[4830]: I1203 22:36:19.220724 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" podStartSLOduration=1.656167783 
podStartE2EDuration="3.220704308s" podCreationTimestamp="2025-12-03 22:36:16 +0000 UTC" firstStartedPulling="2025-12-03 22:36:17.126022048 +0000 UTC m=+1866.122483397" lastFinishedPulling="2025-12-03 22:36:18.690558573 +0000 UTC m=+1867.687019922" observedRunningTime="2025-12-03 22:36:19.217964724 +0000 UTC m=+1868.214426093" watchObservedRunningTime="2025-12-03 22:36:19.220704308 +0000 UTC m=+1868.217165677" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.035240 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-qrsbk"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.046690 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-aa18-account-create-update-l7lnv"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.057640 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-qrsbk"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.069065 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-aa18-account-create-update-l7lnv"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.084481 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-75hs2"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.093937 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-75hs2"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.103446 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-58de-account-create-update-nxbc7"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.112688 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-58de-account-create-update-nxbc7"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.122942 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xdgf6"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.133022 4830 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-pkmrj"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.141676 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3133-account-create-update-2jnjh"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.150083 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3f6b-account-create-update-sbkkw"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.159095 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-pkmrj"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.193030 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xdgf6"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.205623 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3133-account-create-update-2jnjh"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.212520 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3f6b-account-create-update-sbkkw"] Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.348996 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15003660-6e4d-427f-8386-f4be36fc25f8" path="/var/lib/kubelet/pods/15003660-6e4d-427f-8386-f4be36fc25f8/volumes" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.349974 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd0383c-30e5-4787-9d20-fe1c5de32b62" path="/var/lib/kubelet/pods/2fd0383c-30e5-4787-9d20-fe1c5de32b62/volumes" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.350985 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c30a279-b67f-46c3-980e-e38b9cc27eb9" path="/var/lib/kubelet/pods/5c30a279-b67f-46c3-980e-e38b9cc27eb9/volumes" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.351779 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="600b2e52-def7-4084-982d-5ccef14d35fd" path="/var/lib/kubelet/pods/600b2e52-def7-4084-982d-5ccef14d35fd/volumes" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.353044 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698c1d5e-41e8-4701-b3d3-81b015482ff1" path="/var/lib/kubelet/pods/698c1d5e-41e8-4701-b3d3-81b015482ff1/volumes" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.354047 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aceb571f-95c2-46b6-a84a-fdd448fdc167" path="/var/lib/kubelet/pods/aceb571f-95c2-46b6-a84a-fdd448fdc167/volumes" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.355259 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b445f62c-c9c8-488f-ad3c-4fd162cb1092" path="/var/lib/kubelet/pods/b445f62c-c9c8-488f-ad3c-4fd162cb1092/volumes" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.357424 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ece3f713-2d71-4b1f-a20c-504f9e2dec24" path="/var/lib/kubelet/pods/ece3f713-2d71-4b1f-a20c-504f9e2dec24/volumes" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.787950 4830 scope.go:117] "RemoveContainer" containerID="11b0655835c105a17984c97cf10b835f59090637aeb762ab8ee59130236101c1" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.811617 4830 scope.go:117] "RemoveContainer" containerID="01d4117f8c2746ef4c1bab19aadcc265bc1db4c5c7b498fd49b5899811834aac" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.873693 4830 scope.go:117] "RemoveContainer" containerID="7dbf4adab0384eff2f5dfd5372df3919911593d045a4634cf49dfd6b43a83dfa" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.913967 4830 scope.go:117] "RemoveContainer" containerID="2fb09a216cdc3994f100acf5555012c5fe650752d985e5588d408afd4fdcc853" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.951151 4830 scope.go:117] "RemoveContainer" 
containerID="2502067c4ad33697b5c33b31f0ff06320937b3779cd0d6e8d0a183ca157836e0" Dec 03 22:36:21 crc kubenswrapper[4830]: I1203 22:36:21.987677 4830 scope.go:117] "RemoveContainer" containerID="ea1df43caaca0ac62e8a3f3f6c90cf718bce8a0d68992aabab8cd0084c1eadf5" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.013898 4830 scope.go:117] "RemoveContainer" containerID="9a459ab196641fd8ee1cab6be469ade4a83ac9f84fc13930d676caad58206ba1" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.066859 4830 scope.go:117] "RemoveContainer" containerID="0ed64c44fe8b23d64eec54eb01a7a7bced72448258e02568a28dd674e97f77e0" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.120136 4830 scope.go:117] "RemoveContainer" containerID="fdfe759c0a5830ae7a8b9cb69bde7685d6ffd3f470eac868625a72890e05553e" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.139535 4830 scope.go:117] "RemoveContainer" containerID="20cac797d0ba41b661a2bb06b3a8f213c3f1e241fd77bede695545c27f6a1243" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.164352 4830 scope.go:117] "RemoveContainer" containerID="1b348f93f5e718fd1311fcce5b4054d2a45bafdbfb9837350e4b193e50f7aeb7" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.187304 4830 scope.go:117] "RemoveContainer" containerID="9ed2b8a980b371638c2496213d6a5f2b6faade6e2cd927aa1e9578cd86308511" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.207939 4830 scope.go:117] "RemoveContainer" containerID="dfa4d1d9a173f4260396b1f874bc8f68e8891bf73bc1bc8071a97e792dab79ad" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.240244 4830 scope.go:117] "RemoveContainer" containerID="931d028b345093106aeddc013ab15181e99804a7852b95d6fe5f29ad3f446c92" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.259856 4830 scope.go:117] "RemoveContainer" containerID="7e67ba5c6a8156a872c1de79b612ceb9edec00a2e0c19a4a325b34a372d65dca" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.289162 4830 scope.go:117] "RemoveContainer" 
containerID="4e43c513a887fc31d9893f9f16f61bbd0965d9f769e06d7765a8c818238b3325" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.311662 4830 scope.go:117] "RemoveContainer" containerID="3a84c594a8dcde177c39dc868c2b7ee8b8294d617e388f110d43ee4af4f7e69f" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.339680 4830 scope.go:117] "RemoveContainer" containerID="ef9b41d81a3416d5b15a18e15ae8a2717234d9af53878c2952084f7c9a016e31" Dec 03 22:36:22 crc kubenswrapper[4830]: I1203 22:36:22.371618 4830 scope.go:117] "RemoveContainer" containerID="009586eddc93b97e8df96e663be669b2bea03269ac57f3b16b1d7848175b1c1d" Dec 03 22:36:23 crc kubenswrapper[4830]: I1203 22:36:23.337171 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:36:23 crc kubenswrapper[4830]: E1203 22:36:23.337812 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:36:26 crc kubenswrapper[4830]: I1203 22:36:26.033262 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-6j6q9"] Dec 03 22:36:26 crc kubenswrapper[4830]: I1203 22:36:26.043133 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-6j6q9"] Dec 03 22:36:27 crc kubenswrapper[4830]: I1203 22:36:27.354874 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a872d21-c6ee-47bc-a31e-a529bd4e2ff4" path="/var/lib/kubelet/pods/2a872d21-c6ee-47bc-a31e-a529bd4e2ff4/volumes" Dec 03 22:36:38 crc kubenswrapper[4830]: I1203 22:36:38.337777 4830 scope.go:117] "RemoveContainer" 
containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:36:38 crc kubenswrapper[4830]: E1203 22:36:38.338771 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:36:49 crc kubenswrapper[4830]: I1203 22:36:49.338296 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:36:49 crc kubenswrapper[4830]: E1203 22:36:49.339259 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:37:02 crc kubenswrapper[4830]: I1203 22:37:02.337241 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:37:02 crc kubenswrapper[4830]: E1203 22:37:02.338062 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:37:03 crc kubenswrapper[4830]: I1203 22:37:03.049601 4830 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-4g65n"] Dec 03 22:37:03 crc kubenswrapper[4830]: I1203 22:37:03.062542 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-4g65n"] Dec 03 22:37:03 crc kubenswrapper[4830]: I1203 22:37:03.348881 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cee81fc-684a-4fd2-886a-d899c16a5f8b" path="/var/lib/kubelet/pods/2cee81fc-684a-4fd2-886a-d899c16a5f8b/volumes" Dec 03 22:37:14 crc kubenswrapper[4830]: I1203 22:37:14.337268 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:37:14 crc kubenswrapper[4830]: E1203 22:37:14.338148 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:37:17 crc kubenswrapper[4830]: I1203 22:37:17.051075 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-rjqfw"] Dec 03 22:37:17 crc kubenswrapper[4830]: I1203 22:37:17.063023 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-rjqfw"] Dec 03 22:37:17 crc kubenswrapper[4830]: I1203 22:37:17.349615 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f130ecd2-4428-4f1e-a386-7084fc52689b" path="/var/lib/kubelet/pods/f130ecd2-4428-4f1e-a386-7084fc52689b/volumes" Dec 03 22:37:19 crc kubenswrapper[4830]: I1203 22:37:19.058120 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-l6rwx"] Dec 03 22:37:19 crc kubenswrapper[4830]: I1203 22:37:19.071613 4830 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/keystone-bootstrap-l6rwx"] Dec 03 22:37:19 crc kubenswrapper[4830]: I1203 22:37:19.351399 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6917fbd-195a-4b58-83e0-27988c69b57d" path="/var/lib/kubelet/pods/e6917fbd-195a-4b58-83e0-27988c69b57d/volumes" Dec 03 22:37:22 crc kubenswrapper[4830]: I1203 22:37:22.647912 4830 scope.go:117] "RemoveContainer" containerID="a4b37c265b5d4fd9e3da2072f0efcfb1376ff8ea294f05fd856a5b2409bc42d3" Dec 03 22:37:22 crc kubenswrapper[4830]: I1203 22:37:22.685550 4830 scope.go:117] "RemoveContainer" containerID="17947ed7c1c22b53408db6b030acdd4dbae1374c362e3ce0d46a0ddd0f87d478" Dec 03 22:37:22 crc kubenswrapper[4830]: I1203 22:37:22.738917 4830 scope.go:117] "RemoveContainer" containerID="f43cb1a53606b67fa1a754ad6d6b456f1bf7339c9f781128e798c0ac6f3f4098" Dec 03 22:37:22 crc kubenswrapper[4830]: I1203 22:37:22.796019 4830 scope.go:117] "RemoveContainer" containerID="2e900a8f197bb84a2334d2d3e0082a9759630f7ff33a9805947d537402c76536" Dec 03 22:37:27 crc kubenswrapper[4830]: I1203 22:37:27.337471 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:37:27 crc kubenswrapper[4830]: E1203 22:37:27.338102 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:37:33 crc kubenswrapper[4830]: I1203 22:37:33.040522 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-94wsw"] Dec 03 22:37:33 crc kubenswrapper[4830]: I1203 22:37:33.056647 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-sync-94wsw"] Dec 03 22:37:33 crc kubenswrapper[4830]: I1203 22:37:33.348248 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e08a2474-4282-4814-8d14-438d92f1c593" path="/var/lib/kubelet/pods/e08a2474-4282-4814-8d14-438d92f1c593/volumes" Dec 03 22:37:34 crc kubenswrapper[4830]: I1203 22:37:34.034756 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-z5lqr"] Dec 03 22:37:34 crc kubenswrapper[4830]: I1203 22:37:34.047480 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-z5lqr"] Dec 03 22:37:35 crc kubenswrapper[4830]: I1203 22:37:35.357000 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9793d86-3f26-4443-b740-c2bbcc65f58c" path="/var/lib/kubelet/pods/a9793d86-3f26-4443-b740-c2bbcc65f58c/volumes" Dec 03 22:37:41 crc kubenswrapper[4830]: I1203 22:37:41.345530 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:37:41 crc kubenswrapper[4830]: E1203 22:37:41.346981 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:37:52 crc kubenswrapper[4830]: I1203 22:37:52.337023 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:37:52 crc kubenswrapper[4830]: E1203 22:37:52.337934 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:38:03 crc kubenswrapper[4830]: I1203 22:38:03.337163 4830 scope.go:117] "RemoveContainer" containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:38:04 crc kubenswrapper[4830]: I1203 22:38:04.338773 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"d10b423867c2686e7447bb9deaddcfc31d35a58e073292e40d8a363528880e7e"} Dec 03 22:38:06 crc kubenswrapper[4830]: I1203 22:38:06.362408 4830 generic.go:334] "Generic (PLEG): container finished" podID="1f140812-ce94-43b3-bdec-466d8a3d2417" containerID="47e14a62fc69d8bad0769732d296ab705b03c0fde7741d1916554edf797d7d93" exitCode=0 Dec 03 22:38:06 crc kubenswrapper[4830]: I1203 22:38:06.362492 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" event={"ID":"1f140812-ce94-43b3-bdec-466d8a3d2417","Type":"ContainerDied","Data":"47e14a62fc69d8bad0769732d296ab705b03c0fde7741d1916554edf797d7d93"} Dec 03 22:38:07 crc kubenswrapper[4830]: I1203 22:38:07.913861 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.050399 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-inventory\") pod \"1f140812-ce94-43b3-bdec-466d8a3d2417\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.050573 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-ssh-key\") pod \"1f140812-ce94-43b3-bdec-466d8a3d2417\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.050599 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz5gq\" (UniqueName: \"kubernetes.io/projected/1f140812-ce94-43b3-bdec-466d8a3d2417-kube-api-access-wz5gq\") pod \"1f140812-ce94-43b3-bdec-466d8a3d2417\" (UID: \"1f140812-ce94-43b3-bdec-466d8a3d2417\") " Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.064115 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f140812-ce94-43b3-bdec-466d8a3d2417-kube-api-access-wz5gq" (OuterVolumeSpecName: "kube-api-access-wz5gq") pod "1f140812-ce94-43b3-bdec-466d8a3d2417" (UID: "1f140812-ce94-43b3-bdec-466d8a3d2417"). InnerVolumeSpecName "kube-api-access-wz5gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.081316 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-inventory" (OuterVolumeSpecName: "inventory") pod "1f140812-ce94-43b3-bdec-466d8a3d2417" (UID: "1f140812-ce94-43b3-bdec-466d8a3d2417"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.082813 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1f140812-ce94-43b3-bdec-466d8a3d2417" (UID: "1f140812-ce94-43b3-bdec-466d8a3d2417"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.153258 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.153290 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz5gq\" (UniqueName: \"kubernetes.io/projected/1f140812-ce94-43b3-bdec-466d8a3d2417-kube-api-access-wz5gq\") on node \"crc\" DevicePath \"\"" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.153301 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f140812-ce94-43b3-bdec-466d8a3d2417-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.385759 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" event={"ID":"1f140812-ce94-43b3-bdec-466d8a3d2417","Type":"ContainerDied","Data":"62f5dff72328067489c99ee6e2ee8b3cfa8106f9d43d39e9f86f0a842ebb5596"} Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.386200 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62f5dff72328067489c99ee6e2ee8b3cfa8106f9d43d39e9f86f0a842ebb5596" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.385822 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.484459 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp"] Dec 03 22:38:08 crc kubenswrapper[4830]: E1203 22:38:08.484921 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f140812-ce94-43b3-bdec-466d8a3d2417" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.484941 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f140812-ce94-43b3-bdec-466d8a3d2417" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.485147 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f140812-ce94-43b3-bdec-466d8a3d2417" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.485937 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.488934 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.488946 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.489155 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.489871 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.509053 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp"] Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.561688 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5zttp\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.561795 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5zttp\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.562003 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2ccc\" (UniqueName: \"kubernetes.io/projected/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-kube-api-access-n2ccc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5zttp\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.663866 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2ccc\" (UniqueName: \"kubernetes.io/projected/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-kube-api-access-n2ccc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5zttp\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.664062 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5zttp\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.664103 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5zttp\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.670786 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-5zttp\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.671340 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5zttp\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.682807 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2ccc\" (UniqueName: \"kubernetes.io/projected/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-kube-api-access-n2ccc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5zttp\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:38:08 crc kubenswrapper[4830]: I1203 22:38:08.809397 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:38:09 crc kubenswrapper[4830]: I1203 22:38:09.420563 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:38:09 crc kubenswrapper[4830]: I1203 22:38:09.420718 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp"] Dec 03 22:38:10 crc kubenswrapper[4830]: I1203 22:38:10.415323 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" event={"ID":"30423410-ddd7-4a94-8a3a-b20b6dfd64c8","Type":"ContainerStarted","Data":"4c32bc82326f5084f430a076ddf2952dd99e613ec2ac1122afc18ce34c2ae327"} Dec 03 22:38:10 crc kubenswrapper[4830]: I1203 22:38:10.415861 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" event={"ID":"30423410-ddd7-4a94-8a3a-b20b6dfd64c8","Type":"ContainerStarted","Data":"026cbe14a428e47f13aa2e597d72a601f89922b853b1f16d794baec7e85a8028"} Dec 03 22:38:10 crc kubenswrapper[4830]: I1203 22:38:10.451262 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" podStartSLOduration=2.024362017 podStartE2EDuration="2.451236066s" podCreationTimestamp="2025-12-03 22:38:08 +0000 UTC" firstStartedPulling="2025-12-03 22:38:09.420275575 +0000 UTC m=+1978.416736924" lastFinishedPulling="2025-12-03 22:38:09.847149624 +0000 UTC m=+1978.843610973" observedRunningTime="2025-12-03 22:38:10.438687937 +0000 UTC m=+1979.435149306" watchObservedRunningTime="2025-12-03 22:38:10.451236066 +0000 UTC m=+1979.447697425" Dec 03 22:38:12 crc kubenswrapper[4830]: I1203 22:38:12.048605 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-77a6-account-create-update-4jwrz"] Dec 03 
22:38:12 crc kubenswrapper[4830]: I1203 22:38:12.081692 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-fb1a-account-create-update-mv227"] Dec 03 22:38:12 crc kubenswrapper[4830]: I1203 22:38:12.094536 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-d6c5m"] Dec 03 22:38:12 crc kubenswrapper[4830]: I1203 22:38:12.105606 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-77a6-account-create-update-4jwrz"] Dec 03 22:38:12 crc kubenswrapper[4830]: I1203 22:38:12.113906 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-d6c5m"] Dec 03 22:38:12 crc kubenswrapper[4830]: I1203 22:38:12.133471 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-fb1a-account-create-update-mv227"] Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.032332 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-gr4gs"] Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.042735 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rkz52"] Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.054041 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-gr4gs"] Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.063682 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-616a-account-create-update-rvrn5"] Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.072438 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-rkz52"] Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.080861 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-616a-account-create-update-rvrn5"] Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.350379 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="05a32af7-077b-4320-bdc0-a208b4132154" path="/var/lib/kubelet/pods/05a32af7-077b-4320-bdc0-a208b4132154/volumes" Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.351215 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ff28df-2c9e-4694-a75d-9b7dd825c575" path="/var/lib/kubelet/pods/37ff28df-2c9e-4694-a75d-9b7dd825c575/volumes" Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.351952 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d8b7eb-e5b5-4445-9b9d-2a0472500fea" path="/var/lib/kubelet/pods/42d8b7eb-e5b5-4445-9b9d-2a0472500fea/volumes" Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.352549 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630ed790-c3e9-471e-96a4-8f3c63aa837e" path="/var/lib/kubelet/pods/630ed790-c3e9-471e-96a4-8f3c63aa837e/volumes" Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.353613 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67505812-b68d-4b56-b353-749d2f12dcdd" path="/var/lib/kubelet/pods/67505812-b68d-4b56-b353-749d2f12dcdd/volumes" Dec 03 22:38:13 crc kubenswrapper[4830]: I1203 22:38:13.354186 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab83afa3-0096-4c4d-804e-656d1c5de542" path="/var/lib/kubelet/pods/ab83afa3-0096-4c4d-804e-656d1c5de542/volumes" Dec 03 22:38:22 crc kubenswrapper[4830]: I1203 22:38:22.933402 4830 scope.go:117] "RemoveContainer" containerID="9d68b27bfcbbe782e3d3d689df2dc58130c516865ed56155f590e7bed57cae04" Dec 03 22:38:22 crc kubenswrapper[4830]: I1203 22:38:22.974134 4830 scope.go:117] "RemoveContainer" containerID="44fdaccdcd8d7a47e8d2039990ec16eab1d6a698b38bb6a5f9019e5c18399c69" Dec 03 22:38:23 crc kubenswrapper[4830]: I1203 22:38:23.048249 4830 scope.go:117] "RemoveContainer" containerID="f3995af32947864f39da4fb01355ffba1b0884dba503918c90f6f07ef9014fdd" Dec 03 22:38:23 crc kubenswrapper[4830]: I1203 22:38:23.092072 4830 scope.go:117] 
"RemoveContainer" containerID="d1d1841f0fac9bace30726b0e572184eb687046ada40ac2f7cdd7112b3eec89d" Dec 03 22:38:23 crc kubenswrapper[4830]: I1203 22:38:23.147858 4830 scope.go:117] "RemoveContainer" containerID="91a922f2854aaf25588a516e247bf912435bcd2fb83a3771a3bb4c41c9d538c9" Dec 03 22:38:23 crc kubenswrapper[4830]: I1203 22:38:23.192906 4830 scope.go:117] "RemoveContainer" containerID="6296868ed56ae39905acb727e8ee3bb0f53adafefcadfa044fbed1f2e7949f90" Dec 03 22:38:23 crc kubenswrapper[4830]: I1203 22:38:23.262974 4830 scope.go:117] "RemoveContainer" containerID="69904b4d6f7cc0c217f507c0d5363fac25eb8fd1c6609b4d4e245c400e6ca3e7" Dec 03 22:38:23 crc kubenswrapper[4830]: I1203 22:38:23.298825 4830 scope.go:117] "RemoveContainer" containerID="381efed19fedf1a9e8a7dbc42373574a640f624f97ff7a6dc20f1c961a645142" Dec 03 22:38:41 crc kubenswrapper[4830]: I1203 22:38:41.045167 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nb55g"] Dec 03 22:38:41 crc kubenswrapper[4830]: I1203 22:38:41.056882 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nb55g"] Dec 03 22:38:41 crc kubenswrapper[4830]: I1203 22:38:41.358854 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="026064cc-f701-4574-baf6-261e157061a8" path="/var/lib/kubelet/pods/026064cc-f701-4574-baf6-261e157061a8/volumes" Dec 03 22:39:00 crc kubenswrapper[4830]: I1203 22:39:00.050402 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mlqw8"] Dec 03 22:39:00 crc kubenswrapper[4830]: I1203 22:39:00.066197 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mlqw8"] Dec 03 22:39:01 crc kubenswrapper[4830]: I1203 22:39:01.027233 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-v8fhw"] Dec 03 22:39:01 crc kubenswrapper[4830]: I1203 22:39:01.037123 4830 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-v8fhw"] Dec 03 22:39:01 crc kubenswrapper[4830]: I1203 22:39:01.351935 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a58c35e-a16d-4014-a958-67f2a5461287" path="/var/lib/kubelet/pods/2a58c35e-a16d-4014-a958-67f2a5461287/volumes" Dec 03 22:39:01 crc kubenswrapper[4830]: I1203 22:39:01.352727 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79ce7502-4906-4ec8-941a-04aa6486cf93" path="/var/lib/kubelet/pods/79ce7502-4906-4ec8-941a-04aa6486cf93/volumes" Dec 03 22:39:23 crc kubenswrapper[4830]: I1203 22:39:23.500618 4830 scope.go:117] "RemoveContainer" containerID="79fa30ed86e17dd6eea827b5b8b9a7f4f692e4153bc0ee24a942805f9cd53759" Dec 03 22:39:23 crc kubenswrapper[4830]: I1203 22:39:23.541755 4830 scope.go:117] "RemoveContainer" containerID="d1d5ec0b211e31aa4976e4d8ac3cf611cfe8a423df0c18f32ddc6b1e759d6e3f" Dec 03 22:39:23 crc kubenswrapper[4830]: I1203 22:39:23.605931 4830 scope.go:117] "RemoveContainer" containerID="ce7de5cea40f94b237ce0a817171e7fc448448770ce0e17bd595172e4545c28c" Dec 03 22:39:26 crc kubenswrapper[4830]: I1203 22:39:26.236581 4830 generic.go:334] "Generic (PLEG): container finished" podID="30423410-ddd7-4a94-8a3a-b20b6dfd64c8" containerID="4c32bc82326f5084f430a076ddf2952dd99e613ec2ac1122afc18ce34c2ae327" exitCode=0 Dec 03 22:39:26 crc kubenswrapper[4830]: I1203 22:39:26.236669 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" event={"ID":"30423410-ddd7-4a94-8a3a-b20b6dfd64c8","Type":"ContainerDied","Data":"4c32bc82326f5084f430a076ddf2952dd99e613ec2ac1122afc18ce34c2ae327"} Dec 03 22:39:27 crc kubenswrapper[4830]: I1203 22:39:27.737850 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:39:27 crc kubenswrapper[4830]: I1203 22:39:27.835590 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2ccc\" (UniqueName: \"kubernetes.io/projected/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-kube-api-access-n2ccc\") pod \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " Dec 03 22:39:27 crc kubenswrapper[4830]: I1203 22:39:27.835766 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-inventory\") pod \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " Dec 03 22:39:27 crc kubenswrapper[4830]: I1203 22:39:27.835837 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-ssh-key\") pod \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\" (UID: \"30423410-ddd7-4a94-8a3a-b20b6dfd64c8\") " Dec 03 22:39:27 crc kubenswrapper[4830]: I1203 22:39:27.840932 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-kube-api-access-n2ccc" (OuterVolumeSpecName: "kube-api-access-n2ccc") pod "30423410-ddd7-4a94-8a3a-b20b6dfd64c8" (UID: "30423410-ddd7-4a94-8a3a-b20b6dfd64c8"). InnerVolumeSpecName "kube-api-access-n2ccc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:39:27 crc kubenswrapper[4830]: I1203 22:39:27.938797 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2ccc\" (UniqueName: \"kubernetes.io/projected/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-kube-api-access-n2ccc\") on node \"crc\" DevicePath \"\"" Dec 03 22:39:27 crc kubenswrapper[4830]: I1203 22:39:27.951213 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "30423410-ddd7-4a94-8a3a-b20b6dfd64c8" (UID: "30423410-ddd7-4a94-8a3a-b20b6dfd64c8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:39:27 crc kubenswrapper[4830]: I1203 22:39:27.960974 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-inventory" (OuterVolumeSpecName: "inventory") pod "30423410-ddd7-4a94-8a3a-b20b6dfd64c8" (UID: "30423410-ddd7-4a94-8a3a-b20b6dfd64c8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.040435 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.040480 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30423410-ddd7-4a94-8a3a-b20b6dfd64c8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.255417 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" event={"ID":"30423410-ddd7-4a94-8a3a-b20b6dfd64c8","Type":"ContainerDied","Data":"026cbe14a428e47f13aa2e597d72a601f89922b853b1f16d794baec7e85a8028"} Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.255775 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="026cbe14a428e47f13aa2e597d72a601f89922b853b1f16d794baec7e85a8028" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.255467 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5zttp" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.346335 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw"] Dec 03 22:39:28 crc kubenswrapper[4830]: E1203 22:39:28.346800 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30423410-ddd7-4a94-8a3a-b20b6dfd64c8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.346822 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="30423410-ddd7-4a94-8a3a-b20b6dfd64c8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.347091 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="30423410-ddd7-4a94-8a3a-b20b6dfd64c8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.347888 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.352295 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.352361 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.352576 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.353880 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.395615 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw"] Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.448737 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.448820 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfzw8\" (UniqueName: \"kubernetes.io/projected/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-kube-api-access-nfzw8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 
22:39:28.448923 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.551773 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.551878 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfzw8\" (UniqueName: \"kubernetes.io/projected/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-kube-api-access-nfzw8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.552046 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.556698 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.557063 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.584164 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfzw8\" (UniqueName: \"kubernetes.io/projected/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-kube-api-access-nfzw8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:28 crc kubenswrapper[4830]: I1203 22:39:28.665636 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:29 crc kubenswrapper[4830]: I1203 22:39:29.305340 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw"] Dec 03 22:39:30 crc kubenswrapper[4830]: I1203 22:39:30.274281 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" event={"ID":"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5","Type":"ContainerStarted","Data":"c8ec8bd127888d4625d6614984ef528d823856f4a366fa501d95b7af9302c934"} Dec 03 22:39:30 crc kubenswrapper[4830]: I1203 22:39:30.274593 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" event={"ID":"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5","Type":"ContainerStarted","Data":"70000b2ec71382aa3aaa63f889828e82c73fa4aa2089c321706cd95682d7738d"} Dec 03 22:39:30 crc kubenswrapper[4830]: I1203 22:39:30.289401 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" podStartSLOduration=1.762619567 podStartE2EDuration="2.289382488s" podCreationTimestamp="2025-12-03 22:39:28 +0000 UTC" firstStartedPulling="2025-12-03 22:39:29.317629939 +0000 UTC m=+2058.314091288" lastFinishedPulling="2025-12-03 22:39:29.84439286 +0000 UTC m=+2058.840854209" observedRunningTime="2025-12-03 22:39:30.287779384 +0000 UTC m=+2059.284240733" watchObservedRunningTime="2025-12-03 22:39:30.289382488 +0000 UTC m=+2059.285843837" Dec 03 22:39:35 crc kubenswrapper[4830]: I1203 22:39:35.335115 4830 generic.go:334] "Generic (PLEG): container finished" podID="14a22d3c-4a83-41b5-b9e7-7862b7b62ab5" containerID="c8ec8bd127888d4625d6614984ef528d823856f4a366fa501d95b7af9302c934" exitCode=0 Dec 03 22:39:35 crc kubenswrapper[4830]: I1203 22:39:35.335231 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" event={"ID":"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5","Type":"ContainerDied","Data":"c8ec8bd127888d4625d6614984ef528d823856f4a366fa501d95b7af9302c934"} Dec 03 22:39:36 crc kubenswrapper[4830]: I1203 22:39:36.872853 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:36 crc kubenswrapper[4830]: I1203 22:39:36.945946 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfzw8\" (UniqueName: \"kubernetes.io/projected/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-kube-api-access-nfzw8\") pod \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " Dec 03 22:39:36 crc kubenswrapper[4830]: I1203 22:39:36.946028 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-ssh-key\") pod \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " Dec 03 22:39:36 crc kubenswrapper[4830]: I1203 22:39:36.946079 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-inventory\") pod \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\" (UID: \"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5\") " Dec 03 22:39:36 crc kubenswrapper[4830]: I1203 22:39:36.951821 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-kube-api-access-nfzw8" (OuterVolumeSpecName: "kube-api-access-nfzw8") pod "14a22d3c-4a83-41b5-b9e7-7862b7b62ab5" (UID: "14a22d3c-4a83-41b5-b9e7-7862b7b62ab5"). InnerVolumeSpecName "kube-api-access-nfzw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:39:36 crc kubenswrapper[4830]: I1203 22:39:36.976820 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "14a22d3c-4a83-41b5-b9e7-7862b7b62ab5" (UID: "14a22d3c-4a83-41b5-b9e7-7862b7b62ab5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:39:36 crc kubenswrapper[4830]: I1203 22:39:36.977757 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-inventory" (OuterVolumeSpecName: "inventory") pod "14a22d3c-4a83-41b5-b9e7-7862b7b62ab5" (UID: "14a22d3c-4a83-41b5-b9e7-7862b7b62ab5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.048084 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfzw8\" (UniqueName: \"kubernetes.io/projected/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-kube-api-access-nfzw8\") on node \"crc\" DevicePath \"\"" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.048114 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.048124 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a22d3c-4a83-41b5-b9e7-7862b7b62ab5-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.355211 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" 
event={"ID":"14a22d3c-4a83-41b5-b9e7-7862b7b62ab5","Type":"ContainerDied","Data":"70000b2ec71382aa3aaa63f889828e82c73fa4aa2089c321706cd95682d7738d"} Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.355260 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70000b2ec71382aa3aaa63f889828e82c73fa4aa2089c321706cd95682d7738d" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.355359 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.437394 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf"] Dec 03 22:39:37 crc kubenswrapper[4830]: E1203 22:39:37.439123 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a22d3c-4a83-41b5-b9e7-7862b7b62ab5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.439151 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a22d3c-4a83-41b5-b9e7-7862b7b62ab5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.439372 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a22d3c-4a83-41b5-b9e7-7862b7b62ab5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.440289 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.443482 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.443568 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.444190 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.449333 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.453001 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf"] Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.455873 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrvd5\" (UniqueName: \"kubernetes.io/projected/4fc52ab9-68d4-4a61-a92f-8de75a563adc-kube-api-access-lrvd5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk4xf\" (UID: \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.456139 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk4xf\" (UID: \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.456200 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk4xf\" (UID: \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.558233 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk4xf\" (UID: \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.558335 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk4xf\" (UID: \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.558470 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrvd5\" (UniqueName: \"kubernetes.io/projected/4fc52ab9-68d4-4a61-a92f-8de75a563adc-kube-api-access-lrvd5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk4xf\" (UID: \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.567181 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk4xf\" (UID: 
\"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.571031 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk4xf\" (UID: \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.577982 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrvd5\" (UniqueName: \"kubernetes.io/projected/4fc52ab9-68d4-4a61-a92f-8de75a563adc-kube-api-access-lrvd5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kk4xf\" (UID: \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:39:37 crc kubenswrapper[4830]: I1203 22:39:37.758187 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:39:38 crc kubenswrapper[4830]: I1203 22:39:38.323401 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf"] Dec 03 22:39:38 crc kubenswrapper[4830]: I1203 22:39:38.371760 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" event={"ID":"4fc52ab9-68d4-4a61-a92f-8de75a563adc","Type":"ContainerStarted","Data":"7a406612309813069dd5c9b38bf0aa14a55a7cf6cf839a6f2f4ee5fde595e902"} Dec 03 22:39:39 crc kubenswrapper[4830]: I1203 22:39:39.397216 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" event={"ID":"4fc52ab9-68d4-4a61-a92f-8de75a563adc","Type":"ContainerStarted","Data":"7f2b30e230af682a0c7d1252a5d9b470fa6c2ca6dd6660cbdfae5a38304b7b55"} Dec 03 22:39:39 crc kubenswrapper[4830]: I1203 22:39:39.420082 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" podStartSLOduration=2.004895758 podStartE2EDuration="2.420056569s" podCreationTimestamp="2025-12-03 22:39:37 +0000 UTC" firstStartedPulling="2025-12-03 22:39:38.330298368 +0000 UTC m=+2067.326759757" lastFinishedPulling="2025-12-03 22:39:38.745459219 +0000 UTC m=+2067.741920568" observedRunningTime="2025-12-03 22:39:39.413130562 +0000 UTC m=+2068.409591921" watchObservedRunningTime="2025-12-03 22:39:39.420056569 +0000 UTC m=+2068.416517928" Dec 03 22:39:47 crc kubenswrapper[4830]: I1203 22:39:47.051490 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7zlj6"] Dec 03 22:39:47 crc kubenswrapper[4830]: I1203 22:39:47.062088 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7zlj6"] Dec 03 22:39:47 crc kubenswrapper[4830]: I1203 
22:39:47.349194 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7" path="/var/lib/kubelet/pods/c88e63f3-37ab-4eb8-8fe7-c2d875e62bb7/volumes" Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.345906 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8m2xm"] Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.354217 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.366637 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8m2xm"] Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.506176 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-utilities\") pod \"redhat-operators-8m2xm\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.506310 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-catalog-content\") pod \"redhat-operators-8m2xm\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.506551 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79nqn\" (UniqueName: \"kubernetes.io/projected/85719986-300c-4c54-8c62-997c584c27dd-kube-api-access-79nqn\") pod \"redhat-operators-8m2xm\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:06 crc 
kubenswrapper[4830]: I1203 22:40:06.609706 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79nqn\" (UniqueName: \"kubernetes.io/projected/85719986-300c-4c54-8c62-997c584c27dd-kube-api-access-79nqn\") pod \"redhat-operators-8m2xm\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.609892 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-utilities\") pod \"redhat-operators-8m2xm\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.609995 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-catalog-content\") pod \"redhat-operators-8m2xm\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.618231 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-catalog-content\") pod \"redhat-operators-8m2xm\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.618254 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-utilities\") pod \"redhat-operators-8m2xm\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.632722 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79nqn\" (UniqueName: \"kubernetes.io/projected/85719986-300c-4c54-8c62-997c584c27dd-kube-api-access-79nqn\") pod \"redhat-operators-8m2xm\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:06 crc kubenswrapper[4830]: I1203 22:40:06.689430 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:07 crc kubenswrapper[4830]: I1203 22:40:07.203989 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8m2xm"] Dec 03 22:40:07 crc kubenswrapper[4830]: I1203 22:40:07.704843 4830 generic.go:334] "Generic (PLEG): container finished" podID="85719986-300c-4c54-8c62-997c584c27dd" containerID="761fbc108a9e06f5a85eb5fb042fa1893e6da76c8d910debdbbb76ca01f92163" exitCode=0 Dec 03 22:40:07 crc kubenswrapper[4830]: I1203 22:40:07.705032 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m2xm" event={"ID":"85719986-300c-4c54-8c62-997c584c27dd","Type":"ContainerDied","Data":"761fbc108a9e06f5a85eb5fb042fa1893e6da76c8d910debdbbb76ca01f92163"} Dec 03 22:40:07 crc kubenswrapper[4830]: I1203 22:40:07.705157 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m2xm" event={"ID":"85719986-300c-4c54-8c62-997c584c27dd","Type":"ContainerStarted","Data":"235b4a0abe30e0dd9d387a00ee37ee8ad4bd2de9b1e80b25ba27f4b90d46a665"} Dec 03 22:40:08 crc kubenswrapper[4830]: I1203 22:40:08.714110 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m2xm" event={"ID":"85719986-300c-4c54-8c62-997c584c27dd","Type":"ContainerStarted","Data":"6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb"} Dec 03 22:40:12 crc kubenswrapper[4830]: I1203 22:40:12.754187 4830 generic.go:334] "Generic 
(PLEG): container finished" podID="85719986-300c-4c54-8c62-997c584c27dd" containerID="6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb" exitCode=0 Dec 03 22:40:12 crc kubenswrapper[4830]: I1203 22:40:12.754692 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m2xm" event={"ID":"85719986-300c-4c54-8c62-997c584c27dd","Type":"ContainerDied","Data":"6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb"} Dec 03 22:40:14 crc kubenswrapper[4830]: I1203 22:40:14.773991 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m2xm" event={"ID":"85719986-300c-4c54-8c62-997c584c27dd","Type":"ContainerStarted","Data":"fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69"} Dec 03 22:40:14 crc kubenswrapper[4830]: I1203 22:40:14.797852 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8m2xm" podStartSLOduration=3.129403589 podStartE2EDuration="8.797829528s" podCreationTimestamp="2025-12-03 22:40:06 +0000 UTC" firstStartedPulling="2025-12-03 22:40:07.706790407 +0000 UTC m=+2096.703251756" lastFinishedPulling="2025-12-03 22:40:13.375216346 +0000 UTC m=+2102.371677695" observedRunningTime="2025-12-03 22:40:14.792654597 +0000 UTC m=+2103.789115946" watchObservedRunningTime="2025-12-03 22:40:14.797829528 +0000 UTC m=+2103.794290877" Dec 03 22:40:16 crc kubenswrapper[4830]: I1203 22:40:16.690354 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:16 crc kubenswrapper[4830]: I1203 22:40:16.690730 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:17 crc kubenswrapper[4830]: I1203 22:40:17.739640 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8m2xm" 
podUID="85719986-300c-4c54-8c62-997c584c27dd" containerName="registry-server" probeResult="failure" output=< Dec 03 22:40:17 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Dec 03 22:40:17 crc kubenswrapper[4830]: > Dec 03 22:40:18 crc kubenswrapper[4830]: I1203 22:40:18.813735 4830 generic.go:334] "Generic (PLEG): container finished" podID="4fc52ab9-68d4-4a61-a92f-8de75a563adc" containerID="7f2b30e230af682a0c7d1252a5d9b470fa6c2ca6dd6660cbdfae5a38304b7b55" exitCode=0 Dec 03 22:40:18 crc kubenswrapper[4830]: I1203 22:40:18.813839 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" event={"ID":"4fc52ab9-68d4-4a61-a92f-8de75a563adc","Type":"ContainerDied","Data":"7f2b30e230af682a0c7d1252a5d9b470fa6c2ca6dd6660cbdfae5a38304b7b55"} Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.414797 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.505230 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrvd5\" (UniqueName: \"kubernetes.io/projected/4fc52ab9-68d4-4a61-a92f-8de75a563adc-kube-api-access-lrvd5\") pod \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\" (UID: \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.505686 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-inventory\") pod \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\" (UID: \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.505854 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-ssh-key\") pod \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\" (UID: \"4fc52ab9-68d4-4a61-a92f-8de75a563adc\") " Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.528731 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc52ab9-68d4-4a61-a92f-8de75a563adc-kube-api-access-lrvd5" (OuterVolumeSpecName: "kube-api-access-lrvd5") pod "4fc52ab9-68d4-4a61-a92f-8de75a563adc" (UID: "4fc52ab9-68d4-4a61-a92f-8de75a563adc"). InnerVolumeSpecName "kube-api-access-lrvd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.547604 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-inventory" (OuterVolumeSpecName: "inventory") pod "4fc52ab9-68d4-4a61-a92f-8de75a563adc" (UID: "4fc52ab9-68d4-4a61-a92f-8de75a563adc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.567456 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4fc52ab9-68d4-4a61-a92f-8de75a563adc" (UID: "4fc52ab9-68d4-4a61-a92f-8de75a563adc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.608425 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.608455 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fc52ab9-68d4-4a61-a92f-8de75a563adc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.608465 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrvd5\" (UniqueName: \"kubernetes.io/projected/4fc52ab9-68d4-4a61-a92f-8de75a563adc-kube-api-access-lrvd5\") on node \"crc\" DevicePath \"\"" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.849080 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" event={"ID":"4fc52ab9-68d4-4a61-a92f-8de75a563adc","Type":"ContainerDied","Data":"7a406612309813069dd5c9b38bf0aa14a55a7cf6cf839a6f2f4ee5fde595e902"} Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.849128 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a406612309813069dd5c9b38bf0aa14a55a7cf6cf839a6f2f4ee5fde595e902" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.849134 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kk4xf" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.930860 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4"] Dec 03 22:40:20 crc kubenswrapper[4830]: E1203 22:40:20.931387 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc52ab9-68d4-4a61-a92f-8de75a563adc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.931409 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc52ab9-68d4-4a61-a92f-8de75a563adc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.931686 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc52ab9-68d4-4a61-a92f-8de75a563adc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.932615 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.943270 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.943394 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.943742 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.944026 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:40:20 crc kubenswrapper[4830]: I1203 22:40:20.953052 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4"] Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.015963 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgs9w\" (UniqueName: \"kubernetes.io/projected/6145c52f-1582-4092-a44a-cc665216b2af-kube-api-access-vgs9w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4\" (UID: \"6145c52f-1582-4092-a44a-cc665216b2af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.016160 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4\" (UID: \"6145c52f-1582-4092-a44a-cc665216b2af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.016336 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4\" (UID: \"6145c52f-1582-4092-a44a-cc665216b2af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.119044 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4\" (UID: \"6145c52f-1582-4092-a44a-cc665216b2af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.119119 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4\" (UID: \"6145c52f-1582-4092-a44a-cc665216b2af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.119231 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgs9w\" (UniqueName: \"kubernetes.io/projected/6145c52f-1582-4092-a44a-cc665216b2af-kube-api-access-vgs9w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4\" (UID: \"6145c52f-1582-4092-a44a-cc665216b2af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.124895 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4\" (UID: 
\"6145c52f-1582-4092-a44a-cc665216b2af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.125023 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4\" (UID: \"6145c52f-1582-4092-a44a-cc665216b2af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.140427 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgs9w\" (UniqueName: \"kubernetes.io/projected/6145c52f-1582-4092-a44a-cc665216b2af-kube-api-access-vgs9w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4\" (UID: \"6145c52f-1582-4092-a44a-cc665216b2af\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.262056 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.431771 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t579x"] Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.434870 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.443001 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t579x"] Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.534637 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-catalog-content\") pod \"community-operators-t579x\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.534807 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mlh5\" (UniqueName: \"kubernetes.io/projected/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-kube-api-access-7mlh5\") pod \"community-operators-t579x\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.534862 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-utilities\") pod \"community-operators-t579x\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.636544 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mlh5\" (UniqueName: \"kubernetes.io/projected/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-kube-api-access-7mlh5\") pod \"community-operators-t579x\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.636619 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-utilities\") pod \"community-operators-t579x\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.636644 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-catalog-content\") pod \"community-operators-t579x\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.637108 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-utilities\") pod \"community-operators-t579x\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.637170 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-catalog-content\") pod \"community-operators-t579x\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.658689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mlh5\" (UniqueName: \"kubernetes.io/projected/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-kube-api-access-7mlh5\") pod \"community-operators-t579x\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.780332 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:21 crc kubenswrapper[4830]: I1203 22:40:21.959472 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4"] Dec 03 22:40:22 crc kubenswrapper[4830]: I1203 22:40:22.400446 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t579x"] Dec 03 22:40:22 crc kubenswrapper[4830]: W1203 22:40:22.407205 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae03aa7f_1af9_47fb_8e93_ee91a52f6786.slice/crio-5a44ef1bfc204cb11f3d9a9b2e2c4e550fa0d0b98adf7134c3d7d4556cd74df9 WatchSource:0}: Error finding container 5a44ef1bfc204cb11f3d9a9b2e2c4e550fa0d0b98adf7134c3d7d4556cd74df9: Status 404 returned error can't find the container with id 5a44ef1bfc204cb11f3d9a9b2e2c4e550fa0d0b98adf7134c3d7d4556cd74df9 Dec 03 22:40:22 crc kubenswrapper[4830]: I1203 22:40:22.870823 4830 generic.go:334] "Generic (PLEG): container finished" podID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" containerID="c2267f6069fa8801845a4042d1fe9e38d01c8c750864c2adcd67dc9aeeb7aee2" exitCode=0 Dec 03 22:40:22 crc kubenswrapper[4830]: I1203 22:40:22.870893 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t579x" event={"ID":"ae03aa7f-1af9-47fb-8e93-ee91a52f6786","Type":"ContainerDied","Data":"c2267f6069fa8801845a4042d1fe9e38d01c8c750864c2adcd67dc9aeeb7aee2"} Dec 03 22:40:22 crc kubenswrapper[4830]: I1203 22:40:22.871130 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t579x" event={"ID":"ae03aa7f-1af9-47fb-8e93-ee91a52f6786","Type":"ContainerStarted","Data":"5a44ef1bfc204cb11f3d9a9b2e2c4e550fa0d0b98adf7134c3d7d4556cd74df9"} Dec 03 22:40:22 crc kubenswrapper[4830]: I1203 22:40:22.874953 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" event={"ID":"6145c52f-1582-4092-a44a-cc665216b2af","Type":"ContainerStarted","Data":"e1e553938f5342eebe22ff8a230365618ab0de1432a33b80467b66652282de49"} Dec 03 22:40:22 crc kubenswrapper[4830]: I1203 22:40:22.875002 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" event={"ID":"6145c52f-1582-4092-a44a-cc665216b2af","Type":"ContainerStarted","Data":"fcac7dab91ee18c7132416e5741199b971b46f77d09e7c662e48cf39bfd474ce"} Dec 03 22:40:22 crc kubenswrapper[4830]: I1203 22:40:22.908021 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" podStartSLOduration=2.51334132 podStartE2EDuration="2.908001995s" podCreationTimestamp="2025-12-03 22:40:20 +0000 UTC" firstStartedPulling="2025-12-03 22:40:21.962638523 +0000 UTC m=+2110.959099872" lastFinishedPulling="2025-12-03 22:40:22.357299198 +0000 UTC m=+2111.353760547" observedRunningTime="2025-12-03 22:40:22.904574352 +0000 UTC m=+2111.901035741" watchObservedRunningTime="2025-12-03 22:40:22.908001995 +0000 UTC m=+2111.904463364" Dec 03 22:40:23 crc kubenswrapper[4830]: I1203 22:40:23.754601 4830 scope.go:117] "RemoveContainer" containerID="13eaa48ce58652a6ac20ba63776fb0bfcc6ea09b3d81eef56912ec00cf879dd3" Dec 03 22:40:24 crc kubenswrapper[4830]: I1203 22:40:24.899960 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t579x" event={"ID":"ae03aa7f-1af9-47fb-8e93-ee91a52f6786","Type":"ContainerStarted","Data":"8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502"} Dec 03 22:40:26 crc kubenswrapper[4830]: I1203 22:40:26.681639 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:40:26 crc kubenswrapper[4830]: I1203 22:40:26.681974 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:40:26 crc kubenswrapper[4830]: I1203 22:40:26.735215 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:26 crc kubenswrapper[4830]: I1203 22:40:26.785195 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:26 crc kubenswrapper[4830]: I1203 22:40:26.920440 4830 generic.go:334] "Generic (PLEG): container finished" podID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" containerID="8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502" exitCode=0 Dec 03 22:40:26 crc kubenswrapper[4830]: I1203 22:40:26.920538 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t579x" event={"ID":"ae03aa7f-1af9-47fb-8e93-ee91a52f6786","Type":"ContainerDied","Data":"8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502"} Dec 03 22:40:27 crc kubenswrapper[4830]: I1203 22:40:27.409600 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8m2xm"] Dec 03 22:40:27 crc kubenswrapper[4830]: I1203 22:40:27.934104 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t579x" event={"ID":"ae03aa7f-1af9-47fb-8e93-ee91a52f6786","Type":"ContainerStarted","Data":"66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906"} Dec 03 22:40:27 crc kubenswrapper[4830]: I1203 
22:40:27.934205 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8m2xm" podUID="85719986-300c-4c54-8c62-997c584c27dd" containerName="registry-server" containerID="cri-o://fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69" gracePeriod=2 Dec 03 22:40:27 crc kubenswrapper[4830]: I1203 22:40:27.959043 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t579x" podStartSLOduration=2.500535463 podStartE2EDuration="6.959023173s" podCreationTimestamp="2025-12-03 22:40:21 +0000 UTC" firstStartedPulling="2025-12-03 22:40:22.874071537 +0000 UTC m=+2111.870532886" lastFinishedPulling="2025-12-03 22:40:27.332559247 +0000 UTC m=+2116.329020596" observedRunningTime="2025-12-03 22:40:27.953633277 +0000 UTC m=+2116.950094626" watchObservedRunningTime="2025-12-03 22:40:27.959023173 +0000 UTC m=+2116.955484522" Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.505483 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.591451 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-utilities\") pod \"85719986-300c-4c54-8c62-997c584c27dd\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.591633 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-catalog-content\") pod \"85719986-300c-4c54-8c62-997c584c27dd\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.591729 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79nqn\" (UniqueName: \"kubernetes.io/projected/85719986-300c-4c54-8c62-997c584c27dd-kube-api-access-79nqn\") pod \"85719986-300c-4c54-8c62-997c584c27dd\" (UID: \"85719986-300c-4c54-8c62-997c584c27dd\") " Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.592375 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-utilities" (OuterVolumeSpecName: "utilities") pod "85719986-300c-4c54-8c62-997c584c27dd" (UID: "85719986-300c-4c54-8c62-997c584c27dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.597500 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85719986-300c-4c54-8c62-997c584c27dd-kube-api-access-79nqn" (OuterVolumeSpecName: "kube-api-access-79nqn") pod "85719986-300c-4c54-8c62-997c584c27dd" (UID: "85719986-300c-4c54-8c62-997c584c27dd"). InnerVolumeSpecName "kube-api-access-79nqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.694415 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79nqn\" (UniqueName: \"kubernetes.io/projected/85719986-300c-4c54-8c62-997c584c27dd-kube-api-access-79nqn\") on node \"crc\" DevicePath \"\"" Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.694453 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.713672 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85719986-300c-4c54-8c62-997c584c27dd" (UID: "85719986-300c-4c54-8c62-997c584c27dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.796247 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85719986-300c-4c54-8c62-997c584c27dd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.945820 4830 generic.go:334] "Generic (PLEG): container finished" podID="85719986-300c-4c54-8c62-997c584c27dd" containerID="fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69" exitCode=0 Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.945865 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m2xm" event={"ID":"85719986-300c-4c54-8c62-997c584c27dd","Type":"ContainerDied","Data":"fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69"} Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.945893 4830 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-8m2xm" event={"ID":"85719986-300c-4c54-8c62-997c584c27dd","Type":"ContainerDied","Data":"235b4a0abe30e0dd9d387a00ee37ee8ad4bd2de9b1e80b25ba27f4b90d46a665"} Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.945898 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8m2xm" Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.945911 4830 scope.go:117] "RemoveContainer" containerID="fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69" Dec 03 22:40:28 crc kubenswrapper[4830]: I1203 22:40:28.987643 4830 scope.go:117] "RemoveContainer" containerID="6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb" Dec 03 22:40:29 crc kubenswrapper[4830]: I1203 22:40:29.012745 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8m2xm"] Dec 03 22:40:29 crc kubenswrapper[4830]: I1203 22:40:29.032139 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8m2xm"] Dec 03 22:40:29 crc kubenswrapper[4830]: I1203 22:40:29.037682 4830 scope.go:117] "RemoveContainer" containerID="761fbc108a9e06f5a85eb5fb042fa1893e6da76c8d910debdbbb76ca01f92163" Dec 03 22:40:29 crc kubenswrapper[4830]: I1203 22:40:29.124975 4830 scope.go:117] "RemoveContainer" containerID="fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69" Dec 03 22:40:29 crc kubenswrapper[4830]: E1203 22:40:29.125735 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69\": container with ID starting with fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69 not found: ID does not exist" containerID="fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69" Dec 03 22:40:29 crc kubenswrapper[4830]: I1203 22:40:29.125775 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69"} err="failed to get container status \"fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69\": rpc error: code = NotFound desc = could not find container \"fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69\": container with ID starting with fcbf31b03d52fff16feeca5adc7d4eb7dff63f597301f9edc5f9676e3b857b69 not found: ID does not exist" Dec 03 22:40:29 crc kubenswrapper[4830]: I1203 22:40:29.125800 4830 scope.go:117] "RemoveContainer" containerID="6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb" Dec 03 22:40:29 crc kubenswrapper[4830]: E1203 22:40:29.130408 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb\": container with ID starting with 6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb not found: ID does not exist" containerID="6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb" Dec 03 22:40:29 crc kubenswrapper[4830]: I1203 22:40:29.130463 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb"} err="failed to get container status \"6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb\": rpc error: code = NotFound desc = could not find container \"6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb\": container with ID starting with 6d808cacbaa096ad17b191d544f33d671dc43f673200566f72d7c92787f01fcb not found: ID does not exist" Dec 03 22:40:29 crc kubenswrapper[4830]: I1203 22:40:29.130494 4830 scope.go:117] "RemoveContainer" containerID="761fbc108a9e06f5a85eb5fb042fa1893e6da76c8d910debdbbb76ca01f92163" Dec 03 22:40:29 crc kubenswrapper[4830]: E1203 
22:40:29.134668 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"761fbc108a9e06f5a85eb5fb042fa1893e6da76c8d910debdbbb76ca01f92163\": container with ID starting with 761fbc108a9e06f5a85eb5fb042fa1893e6da76c8d910debdbbb76ca01f92163 not found: ID does not exist" containerID="761fbc108a9e06f5a85eb5fb042fa1893e6da76c8d910debdbbb76ca01f92163" Dec 03 22:40:29 crc kubenswrapper[4830]: I1203 22:40:29.134718 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"761fbc108a9e06f5a85eb5fb042fa1893e6da76c8d910debdbbb76ca01f92163"} err="failed to get container status \"761fbc108a9e06f5a85eb5fb042fa1893e6da76c8d910debdbbb76ca01f92163\": rpc error: code = NotFound desc = could not find container \"761fbc108a9e06f5a85eb5fb042fa1893e6da76c8d910debdbbb76ca01f92163\": container with ID starting with 761fbc108a9e06f5a85eb5fb042fa1893e6da76c8d910debdbbb76ca01f92163 not found: ID does not exist" Dec 03 22:40:29 crc kubenswrapper[4830]: I1203 22:40:29.352026 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85719986-300c-4c54-8c62-997c584c27dd" path="/var/lib/kubelet/pods/85719986-300c-4c54-8c62-997c584c27dd/volumes" Dec 03 22:40:31 crc kubenswrapper[4830]: I1203 22:40:31.780615 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:31 crc kubenswrapper[4830]: I1203 22:40:31.780992 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:31 crc kubenswrapper[4830]: I1203 22:40:31.826982 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:41 crc kubenswrapper[4830]: I1203 22:40:41.829965 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:41 crc kubenswrapper[4830]: I1203 22:40:41.900916 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t579x"] Dec 03 22:40:42 crc kubenswrapper[4830]: I1203 22:40:42.120970 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t579x" podUID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" containerName="registry-server" containerID="cri-o://66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906" gracePeriod=2 Dec 03 22:40:42 crc kubenswrapper[4830]: I1203 22:40:42.610024 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:42 crc kubenswrapper[4830]: I1203 22:40:42.707698 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-utilities\") pod \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " Dec 03 22:40:42 crc kubenswrapper[4830]: I1203 22:40:42.707752 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-catalog-content\") pod \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " Dec 03 22:40:42 crc kubenswrapper[4830]: I1203 22:40:42.708023 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mlh5\" (UniqueName: \"kubernetes.io/projected/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-kube-api-access-7mlh5\") pod \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\" (UID: \"ae03aa7f-1af9-47fb-8e93-ee91a52f6786\") " Dec 03 22:40:42 crc kubenswrapper[4830]: I1203 22:40:42.708720 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-utilities" (OuterVolumeSpecName: "utilities") pod "ae03aa7f-1af9-47fb-8e93-ee91a52f6786" (UID: "ae03aa7f-1af9-47fb-8e93-ee91a52f6786"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:40:42 crc kubenswrapper[4830]: I1203 22:40:42.714132 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-kube-api-access-7mlh5" (OuterVolumeSpecName: "kube-api-access-7mlh5") pod "ae03aa7f-1af9-47fb-8e93-ee91a52f6786" (UID: "ae03aa7f-1af9-47fb-8e93-ee91a52f6786"). InnerVolumeSpecName "kube-api-access-7mlh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:40:42 crc kubenswrapper[4830]: I1203 22:40:42.762242 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae03aa7f-1af9-47fb-8e93-ee91a52f6786" (UID: "ae03aa7f-1af9-47fb-8e93-ee91a52f6786"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:40:42 crc kubenswrapper[4830]: I1203 22:40:42.810547 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:40:42 crc kubenswrapper[4830]: I1203 22:40:42.810921 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:40:42 crc kubenswrapper[4830]: I1203 22:40:42.810941 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mlh5\" (UniqueName: \"kubernetes.io/projected/ae03aa7f-1af9-47fb-8e93-ee91a52f6786-kube-api-access-7mlh5\") on node \"crc\" DevicePath \"\"" Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.130743 4830 generic.go:334] "Generic (PLEG): container finished" podID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" containerID="66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906" exitCode=0 Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.130782 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t579x" event={"ID":"ae03aa7f-1af9-47fb-8e93-ee91a52f6786","Type":"ContainerDied","Data":"66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906"} Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.130832 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t579x" event={"ID":"ae03aa7f-1af9-47fb-8e93-ee91a52f6786","Type":"ContainerDied","Data":"5a44ef1bfc204cb11f3d9a9b2e2c4e550fa0d0b98adf7134c3d7d4556cd74df9"} Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.130849 4830 scope.go:117] "RemoveContainer" containerID="66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906" Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 
22:40:43.131658 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t579x" Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.152760 4830 scope.go:117] "RemoveContainer" containerID="8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502" Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.167894 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t579x"] Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.177027 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t579x"] Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.189494 4830 scope.go:117] "RemoveContainer" containerID="c2267f6069fa8801845a4042d1fe9e38d01c8c750864c2adcd67dc9aeeb7aee2" Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.224782 4830 scope.go:117] "RemoveContainer" containerID="66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906" Dec 03 22:40:43 crc kubenswrapper[4830]: E1203 22:40:43.225351 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906\": container with ID starting with 66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906 not found: ID does not exist" containerID="66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906" Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.225404 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906"} err="failed to get container status \"66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906\": rpc error: code = NotFound desc = could not find container \"66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906\": container with ID starting with 
66a9211106d9f7db96d79508d66a838002769e0ff4158330806ffa7e83612906 not found: ID does not exist" Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.225441 4830 scope.go:117] "RemoveContainer" containerID="8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502" Dec 03 22:40:43 crc kubenswrapper[4830]: E1203 22:40:43.225914 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502\": container with ID starting with 8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502 not found: ID does not exist" containerID="8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502" Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.225946 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502"} err="failed to get container status \"8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502\": rpc error: code = NotFound desc = could not find container \"8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502\": container with ID starting with 8814d296360afc595edaccb2d8460832769f50900b53e7f72ba928728ae7e502 not found: ID does not exist" Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.225968 4830 scope.go:117] "RemoveContainer" containerID="c2267f6069fa8801845a4042d1fe9e38d01c8c750864c2adcd67dc9aeeb7aee2" Dec 03 22:40:43 crc kubenswrapper[4830]: E1203 22:40:43.226303 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2267f6069fa8801845a4042d1fe9e38d01c8c750864c2adcd67dc9aeeb7aee2\": container with ID starting with c2267f6069fa8801845a4042d1fe9e38d01c8c750864c2adcd67dc9aeeb7aee2 not found: ID does not exist" containerID="c2267f6069fa8801845a4042d1fe9e38d01c8c750864c2adcd67dc9aeeb7aee2" Dec 03 22:40:43 crc 
kubenswrapper[4830]: I1203 22:40:43.226357 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2267f6069fa8801845a4042d1fe9e38d01c8c750864c2adcd67dc9aeeb7aee2"} err="failed to get container status \"c2267f6069fa8801845a4042d1fe9e38d01c8c750864c2adcd67dc9aeeb7aee2\": rpc error: code = NotFound desc = could not find container \"c2267f6069fa8801845a4042d1fe9e38d01c8c750864c2adcd67dc9aeeb7aee2\": container with ID starting with c2267f6069fa8801845a4042d1fe9e38d01c8c750864c2adcd67dc9aeeb7aee2 not found: ID does not exist" Dec 03 22:40:43 crc kubenswrapper[4830]: I1203 22:40:43.351832 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" path="/var/lib/kubelet/pods/ae03aa7f-1af9-47fb-8e93-ee91a52f6786/volumes" Dec 03 22:40:56 crc kubenswrapper[4830]: I1203 22:40:56.681635 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:40:56 crc kubenswrapper[4830]: I1203 22:40:56.682132 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:41:13 crc kubenswrapper[4830]: I1203 22:41:13.419977 4830 generic.go:334] "Generic (PLEG): container finished" podID="6145c52f-1582-4092-a44a-cc665216b2af" containerID="e1e553938f5342eebe22ff8a230365618ab0de1432a33b80467b66652282de49" exitCode=0 Dec 03 22:41:13 crc kubenswrapper[4830]: I1203 22:41:13.420072 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" event={"ID":"6145c52f-1582-4092-a44a-cc665216b2af","Type":"ContainerDied","Data":"e1e553938f5342eebe22ff8a230365618ab0de1432a33b80467b66652282de49"} Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.079024 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.222964 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-ssh-key\") pod \"6145c52f-1582-4092-a44a-cc665216b2af\" (UID: \"6145c52f-1582-4092-a44a-cc665216b2af\") " Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.223135 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgs9w\" (UniqueName: \"kubernetes.io/projected/6145c52f-1582-4092-a44a-cc665216b2af-kube-api-access-vgs9w\") pod \"6145c52f-1582-4092-a44a-cc665216b2af\" (UID: \"6145c52f-1582-4092-a44a-cc665216b2af\") " Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.223308 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-inventory\") pod \"6145c52f-1582-4092-a44a-cc665216b2af\" (UID: \"6145c52f-1582-4092-a44a-cc665216b2af\") " Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.230766 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6145c52f-1582-4092-a44a-cc665216b2af-kube-api-access-vgs9w" (OuterVolumeSpecName: "kube-api-access-vgs9w") pod "6145c52f-1582-4092-a44a-cc665216b2af" (UID: "6145c52f-1582-4092-a44a-cc665216b2af"). InnerVolumeSpecName "kube-api-access-vgs9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.259645 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6145c52f-1582-4092-a44a-cc665216b2af" (UID: "6145c52f-1582-4092-a44a-cc665216b2af"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.262498 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-inventory" (OuterVolumeSpecName: "inventory") pod "6145c52f-1582-4092-a44a-cc665216b2af" (UID: "6145c52f-1582-4092-a44a-cc665216b2af"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.326839 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgs9w\" (UniqueName: \"kubernetes.io/projected/6145c52f-1582-4092-a44a-cc665216b2af-kube-api-access-vgs9w\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.326868 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.326878 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6145c52f-1582-4092-a44a-cc665216b2af-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.448137 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" 
event={"ID":"6145c52f-1582-4092-a44a-cc665216b2af","Type":"ContainerDied","Data":"fcac7dab91ee18c7132416e5741199b971b46f77d09e7c662e48cf39bfd474ce"} Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.448175 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcac7dab91ee18c7132416e5741199b971b46f77d09e7c662e48cf39bfd474ce" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.448195 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.538881 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wlhvr"] Dec 03 22:41:15 crc kubenswrapper[4830]: E1203 22:41:15.539308 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6145c52f-1582-4092-a44a-cc665216b2af" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.539326 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6145c52f-1582-4092-a44a-cc665216b2af" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:41:15 crc kubenswrapper[4830]: E1203 22:41:15.539344 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85719986-300c-4c54-8c62-997c584c27dd" containerName="extract-utilities" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.539352 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="85719986-300c-4c54-8c62-997c584c27dd" containerName="extract-utilities" Dec 03 22:41:15 crc kubenswrapper[4830]: E1203 22:41:15.539367 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" containerName="extract-utilities" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.539373 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" 
containerName="extract-utilities" Dec 03 22:41:15 crc kubenswrapper[4830]: E1203 22:41:15.539384 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85719986-300c-4c54-8c62-997c584c27dd" containerName="extract-content" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.539389 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="85719986-300c-4c54-8c62-997c584c27dd" containerName="extract-content" Dec 03 22:41:15 crc kubenswrapper[4830]: E1203 22:41:15.539406 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85719986-300c-4c54-8c62-997c584c27dd" containerName="registry-server" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.539413 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="85719986-300c-4c54-8c62-997c584c27dd" containerName="registry-server" Dec 03 22:41:15 crc kubenswrapper[4830]: E1203 22:41:15.539427 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" containerName="extract-content" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.539433 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" containerName="extract-content" Dec 03 22:41:15 crc kubenswrapper[4830]: E1203 22:41:15.539446 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" containerName="registry-server" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.539453 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" containerName="registry-server" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.539675 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="85719986-300c-4c54-8c62-997c584c27dd" containerName="registry-server" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.539692 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae03aa7f-1af9-47fb-8e93-ee91a52f6786" 
containerName="registry-server" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.539708 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6145c52f-1582-4092-a44a-cc665216b2af" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.540453 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.544422 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.544727 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.544914 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.544967 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.553164 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wlhvr"] Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.632379 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sgkp\" (UniqueName: \"kubernetes.io/projected/79dc5285-4e46-46d1-a708-a1f1623b7448-kube-api-access-2sgkp\") pod \"ssh-known-hosts-edpm-deployment-wlhvr\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.632826 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-wlhvr\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.632963 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-wlhvr\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.734586 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-wlhvr\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.734652 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-wlhvr\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.734729 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sgkp\" (UniqueName: \"kubernetes.io/projected/79dc5285-4e46-46d1-a708-a1f1623b7448-kube-api-access-2sgkp\") pod \"ssh-known-hosts-edpm-deployment-wlhvr\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.739344 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-wlhvr\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.753488 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sgkp\" (UniqueName: \"kubernetes.io/projected/79dc5285-4e46-46d1-a708-a1f1623b7448-kube-api-access-2sgkp\") pod \"ssh-known-hosts-edpm-deployment-wlhvr\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.754555 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-wlhvr\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:15 crc kubenswrapper[4830]: I1203 22:41:15.860152 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:16 crc kubenswrapper[4830]: I1203 22:41:16.578585 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wlhvr"] Dec 03 22:41:17 crc kubenswrapper[4830]: I1203 22:41:17.479772 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" event={"ID":"79dc5285-4e46-46d1-a708-a1f1623b7448","Type":"ContainerStarted","Data":"9e08534d3eb97240409c450d2a80eee24c6bb721d89ffd05f56eac5b6cdb0a61"} Dec 03 22:41:17 crc kubenswrapper[4830]: I1203 22:41:17.480376 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" event={"ID":"79dc5285-4e46-46d1-a708-a1f1623b7448","Type":"ContainerStarted","Data":"906818f71088953ea6ebb259f5db9bcc94b91082c2b845d855f41e8756931614"} Dec 03 22:41:18 crc kubenswrapper[4830]: I1203 22:41:18.514635 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" podStartSLOduration=2.909777833 podStartE2EDuration="3.514615254s" podCreationTimestamp="2025-12-03 22:41:15 +0000 UTC" firstStartedPulling="2025-12-03 22:41:16.588351549 +0000 UTC m=+2165.584812898" lastFinishedPulling="2025-12-03 22:41:17.19318896 +0000 UTC m=+2166.189650319" observedRunningTime="2025-12-03 22:41:18.509594928 +0000 UTC m=+2167.506056287" watchObservedRunningTime="2025-12-03 22:41:18.514615254 +0000 UTC m=+2167.511076603" Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.468405 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9tcts"] Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.474141 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.487982 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9tcts"] Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.530624 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-utilities\") pod \"certified-operators-9tcts\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.530801 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-catalog-content\") pod \"certified-operators-9tcts\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.530842 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg6hs\" (UniqueName: \"kubernetes.io/projected/b1293ebb-6f43-4250-b51b-39941694748b-kube-api-access-mg6hs\") pod \"certified-operators-9tcts\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.632814 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-utilities\") pod \"certified-operators-9tcts\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.632943 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-catalog-content\") pod \"certified-operators-9tcts\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.632978 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg6hs\" (UniqueName: \"kubernetes.io/projected/b1293ebb-6f43-4250-b51b-39941694748b-kube-api-access-mg6hs\") pod \"certified-operators-9tcts\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.633376 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-utilities\") pod \"certified-operators-9tcts\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.633644 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-catalog-content\") pod \"certified-operators-9tcts\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.658482 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg6hs\" (UniqueName: \"kubernetes.io/projected/b1293ebb-6f43-4250-b51b-39941694748b-kube-api-access-mg6hs\") pod \"certified-operators-9tcts\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:19 crc kubenswrapper[4830]: I1203 22:41:19.836162 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:20 crc kubenswrapper[4830]: I1203 22:41:20.410521 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9tcts"] Dec 03 22:41:20 crc kubenswrapper[4830]: I1203 22:41:20.521926 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tcts" event={"ID":"b1293ebb-6f43-4250-b51b-39941694748b","Type":"ContainerStarted","Data":"c0ee10b111196f77e8ee13582d3b0eb119185b416b4edd4c81bfe2c75efe5447"} Dec 03 22:41:21 crc kubenswrapper[4830]: I1203 22:41:21.535158 4830 generic.go:334] "Generic (PLEG): container finished" podID="b1293ebb-6f43-4250-b51b-39941694748b" containerID="749b4e5a8d3edf2b7df274402af3120973aa82be0c84199f9530366d53650cf9" exitCode=0 Dec 03 22:41:21 crc kubenswrapper[4830]: I1203 22:41:21.535273 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tcts" event={"ID":"b1293ebb-6f43-4250-b51b-39941694748b","Type":"ContainerDied","Data":"749b4e5a8d3edf2b7df274402af3120973aa82be0c84199f9530366d53650cf9"} Dec 03 22:41:22 crc kubenswrapper[4830]: I1203 22:41:22.547343 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tcts" event={"ID":"b1293ebb-6f43-4250-b51b-39941694748b","Type":"ContainerStarted","Data":"e93e56a4c51617ee56ce0c1d2eeb747cdbd3d440162bf049fdc3c154f97f7d84"} Dec 03 22:41:23 crc kubenswrapper[4830]: I1203 22:41:23.558103 4830 generic.go:334] "Generic (PLEG): container finished" podID="b1293ebb-6f43-4250-b51b-39941694748b" containerID="e93e56a4c51617ee56ce0c1d2eeb747cdbd3d440162bf049fdc3c154f97f7d84" exitCode=0 Dec 03 22:41:23 crc kubenswrapper[4830]: I1203 22:41:23.558226 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tcts" 
event={"ID":"b1293ebb-6f43-4250-b51b-39941694748b","Type":"ContainerDied","Data":"e93e56a4c51617ee56ce0c1d2eeb747cdbd3d440162bf049fdc3c154f97f7d84"} Dec 03 22:41:24 crc kubenswrapper[4830]: I1203 22:41:24.569733 4830 generic.go:334] "Generic (PLEG): container finished" podID="79dc5285-4e46-46d1-a708-a1f1623b7448" containerID="9e08534d3eb97240409c450d2a80eee24c6bb721d89ffd05f56eac5b6cdb0a61" exitCode=0 Dec 03 22:41:24 crc kubenswrapper[4830]: I1203 22:41:24.569893 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" event={"ID":"79dc5285-4e46-46d1-a708-a1f1623b7448","Type":"ContainerDied","Data":"9e08534d3eb97240409c450d2a80eee24c6bb721d89ffd05f56eac5b6cdb0a61"} Dec 03 22:41:25 crc kubenswrapper[4830]: I1203 22:41:25.581935 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tcts" event={"ID":"b1293ebb-6f43-4250-b51b-39941694748b","Type":"ContainerStarted","Data":"d555cf33bfae9ec2e51beb29a7b121c729ff3e1524738ed29154bc88052699c7"} Dec 03 22:41:25 crc kubenswrapper[4830]: I1203 22:41:25.614270 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9tcts" podStartSLOduration=3.852480542 podStartE2EDuration="6.614250217s" podCreationTimestamp="2025-12-03 22:41:19 +0000 UTC" firstStartedPulling="2025-12-03 22:41:21.538274683 +0000 UTC m=+2170.534736032" lastFinishedPulling="2025-12-03 22:41:24.300044358 +0000 UTC m=+2173.296505707" observedRunningTime="2025-12-03 22:41:25.601295466 +0000 UTC m=+2174.597756815" watchObservedRunningTime="2025-12-03 22:41:25.614250217 +0000 UTC m=+2174.610711576" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.139452 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.297541 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-inventory-0\") pod \"79dc5285-4e46-46d1-a708-a1f1623b7448\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.297667 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sgkp\" (UniqueName: \"kubernetes.io/projected/79dc5285-4e46-46d1-a708-a1f1623b7448-kube-api-access-2sgkp\") pod \"79dc5285-4e46-46d1-a708-a1f1623b7448\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.297850 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-ssh-key-openstack-edpm-ipam\") pod \"79dc5285-4e46-46d1-a708-a1f1623b7448\" (UID: \"79dc5285-4e46-46d1-a708-a1f1623b7448\") " Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.302378 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79dc5285-4e46-46d1-a708-a1f1623b7448-kube-api-access-2sgkp" (OuterVolumeSpecName: "kube-api-access-2sgkp") pod "79dc5285-4e46-46d1-a708-a1f1623b7448" (UID: "79dc5285-4e46-46d1-a708-a1f1623b7448"). InnerVolumeSpecName "kube-api-access-2sgkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.334119 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "79dc5285-4e46-46d1-a708-a1f1623b7448" (UID: "79dc5285-4e46-46d1-a708-a1f1623b7448"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.334710 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "79dc5285-4e46-46d1-a708-a1f1623b7448" (UID: "79dc5285-4e46-46d1-a708-a1f1623b7448"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.401214 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.401247 4830 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/79dc5285-4e46-46d1-a708-a1f1623b7448-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.401258 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sgkp\" (UniqueName: \"kubernetes.io/projected/79dc5285-4e46-46d1-a708-a1f1623b7448-kube-api-access-2sgkp\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.594724 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.594729 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wlhvr" event={"ID":"79dc5285-4e46-46d1-a708-a1f1623b7448","Type":"ContainerDied","Data":"906818f71088953ea6ebb259f5db9bcc94b91082c2b845d855f41e8756931614"} Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.594786 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="906818f71088953ea6ebb259f5db9bcc94b91082c2b845d855f41e8756931614" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.666262 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9"] Dec 03 22:41:26 crc kubenswrapper[4830]: E1203 22:41:26.666788 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79dc5285-4e46-46d1-a708-a1f1623b7448" containerName="ssh-known-hosts-edpm-deployment" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.666813 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dc5285-4e46-46d1-a708-a1f1623b7448" containerName="ssh-known-hosts-edpm-deployment" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.667090 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="79dc5285-4e46-46d1-a708-a1f1623b7448" containerName="ssh-known-hosts-edpm-deployment" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.668003 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.672593 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.672927 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.672593 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.673253 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.680980 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.681053 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.681102 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.681874 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d10b423867c2686e7447bb9deaddcfc31d35a58e073292e40d8a363528880e7e"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.681931 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://d10b423867c2686e7447bb9deaddcfc31d35a58e073292e40d8a363528880e7e" gracePeriod=600 Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.690492 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9"] Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.809587 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thb4j\" (UniqueName: \"kubernetes.io/projected/922398e7-7669-44c9-89e4-cf9cfea422c8-kube-api-access-thb4j\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zhhk9\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.810591 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zhhk9\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.810900 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-inventory\") 
pod \"run-os-edpm-deployment-openstack-edpm-ipam-zhhk9\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.916635 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zhhk9\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.917163 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zhhk9\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.917448 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thb4j\" (UniqueName: \"kubernetes.io/projected/922398e7-7669-44c9-89e4-cf9cfea422c8-kube-api-access-thb4j\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zhhk9\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.924168 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zhhk9\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.924258 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zhhk9\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.935233 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thb4j\" (UniqueName: \"kubernetes.io/projected/922398e7-7669-44c9-89e4-cf9cfea422c8-kube-api-access-thb4j\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zhhk9\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:26 crc kubenswrapper[4830]: I1203 22:41:26.994270 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:27 crc kubenswrapper[4830]: I1203 22:41:27.606027 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="d10b423867c2686e7447bb9deaddcfc31d35a58e073292e40d8a363528880e7e" exitCode=0 Dec 03 22:41:27 crc kubenswrapper[4830]: I1203 22:41:27.606141 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"d10b423867c2686e7447bb9deaddcfc31d35a58e073292e40d8a363528880e7e"} Dec 03 22:41:27 crc kubenswrapper[4830]: I1203 22:41:27.606811 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457"} Dec 03 22:41:27 crc kubenswrapper[4830]: I1203 22:41:27.606840 4830 scope.go:117] "RemoveContainer" 
containerID="42be82899c37694e1ce241b88697b9d3655b398a66a1ee7f0c596abfe2f16645" Dec 03 22:41:27 crc kubenswrapper[4830]: I1203 22:41:27.653492 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9"] Dec 03 22:41:28 crc kubenswrapper[4830]: I1203 22:41:28.620918 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" event={"ID":"922398e7-7669-44c9-89e4-cf9cfea422c8","Type":"ContainerStarted","Data":"f92a230be6e3ffc92c60dd8c71b50fefe3a19731cbe6bd3286b04051d9cde77f"} Dec 03 22:41:29 crc kubenswrapper[4830]: I1203 22:41:29.836539 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:29 crc kubenswrapper[4830]: I1203 22:41:29.837145 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:29 crc kubenswrapper[4830]: I1203 22:41:29.895305 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:30 crc kubenswrapper[4830]: I1203 22:41:30.658340 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" event={"ID":"922398e7-7669-44c9-89e4-cf9cfea422c8","Type":"ContainerStarted","Data":"c98def0cf5fc3379168af0a64c5e4571579f9d83e9c846e69e620ed99497fa21"} Dec 03 22:41:30 crc kubenswrapper[4830]: I1203 22:41:30.679239 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" podStartSLOduration=2.928582168 podStartE2EDuration="4.679223262s" podCreationTimestamp="2025-12-03 22:41:26 +0000 UTC" firstStartedPulling="2025-12-03 22:41:27.651666528 +0000 UTC m=+2176.648127897" lastFinishedPulling="2025-12-03 22:41:29.402307632 +0000 UTC m=+2178.398768991" 
observedRunningTime="2025-12-03 22:41:30.674573406 +0000 UTC m=+2179.671034755" watchObservedRunningTime="2025-12-03 22:41:30.679223262 +0000 UTC m=+2179.675684611" Dec 03 22:41:30 crc kubenswrapper[4830]: I1203 22:41:30.717895 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:30 crc kubenswrapper[4830]: I1203 22:41:30.788401 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9tcts"] Dec 03 22:41:32 crc kubenswrapper[4830]: I1203 22:41:32.673983 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9tcts" podUID="b1293ebb-6f43-4250-b51b-39941694748b" containerName="registry-server" containerID="cri-o://d555cf33bfae9ec2e51beb29a7b121c729ff3e1524738ed29154bc88052699c7" gracePeriod=2 Dec 03 22:41:33 crc kubenswrapper[4830]: I1203 22:41:33.685024 4830 generic.go:334] "Generic (PLEG): container finished" podID="b1293ebb-6f43-4250-b51b-39941694748b" containerID="d555cf33bfae9ec2e51beb29a7b121c729ff3e1524738ed29154bc88052699c7" exitCode=0 Dec 03 22:41:33 crc kubenswrapper[4830]: I1203 22:41:33.685095 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tcts" event={"ID":"b1293ebb-6f43-4250-b51b-39941694748b","Type":"ContainerDied","Data":"d555cf33bfae9ec2e51beb29a7b121c729ff3e1524738ed29154bc88052699c7"} Dec 03 22:41:33 crc kubenswrapper[4830]: I1203 22:41:33.835925 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:33 crc kubenswrapper[4830]: I1203 22:41:33.867053 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-catalog-content\") pod \"b1293ebb-6f43-4250-b51b-39941694748b\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " Dec 03 22:41:33 crc kubenswrapper[4830]: I1203 22:41:33.867359 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg6hs\" (UniqueName: \"kubernetes.io/projected/b1293ebb-6f43-4250-b51b-39941694748b-kube-api-access-mg6hs\") pod \"b1293ebb-6f43-4250-b51b-39941694748b\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " Dec 03 22:41:33 crc kubenswrapper[4830]: I1203 22:41:33.867412 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-utilities\") pod \"b1293ebb-6f43-4250-b51b-39941694748b\" (UID: \"b1293ebb-6f43-4250-b51b-39941694748b\") " Dec 03 22:41:33 crc kubenswrapper[4830]: I1203 22:41:33.869392 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-utilities" (OuterVolumeSpecName: "utilities") pod "b1293ebb-6f43-4250-b51b-39941694748b" (UID: "b1293ebb-6f43-4250-b51b-39941694748b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:41:33 crc kubenswrapper[4830]: I1203 22:41:33.875589 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1293ebb-6f43-4250-b51b-39941694748b-kube-api-access-mg6hs" (OuterVolumeSpecName: "kube-api-access-mg6hs") pod "b1293ebb-6f43-4250-b51b-39941694748b" (UID: "b1293ebb-6f43-4250-b51b-39941694748b"). InnerVolumeSpecName "kube-api-access-mg6hs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:41:33 crc kubenswrapper[4830]: I1203 22:41:33.970047 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg6hs\" (UniqueName: \"kubernetes.io/projected/b1293ebb-6f43-4250-b51b-39941694748b-kube-api-access-mg6hs\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:33 crc kubenswrapper[4830]: I1203 22:41:33.970083 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:34 crc kubenswrapper[4830]: I1203 22:41:34.045200 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-4v4rm"] Dec 03 22:41:34 crc kubenswrapper[4830]: I1203 22:41:34.054735 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-4v4rm"] Dec 03 22:41:34 crc kubenswrapper[4830]: I1203 22:41:34.722645 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tcts" event={"ID":"b1293ebb-6f43-4250-b51b-39941694748b","Type":"ContainerDied","Data":"c0ee10b111196f77e8ee13582d3b0eb119185b416b4edd4c81bfe2c75efe5447"} Dec 03 22:41:34 crc kubenswrapper[4830]: I1203 22:41:34.722982 4830 scope.go:117] "RemoveContainer" containerID="d555cf33bfae9ec2e51beb29a7b121c729ff3e1524738ed29154bc88052699c7" Dec 03 22:41:34 crc kubenswrapper[4830]: I1203 22:41:34.722784 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9tcts" Dec 03 22:41:34 crc kubenswrapper[4830]: I1203 22:41:34.750482 4830 scope.go:117] "RemoveContainer" containerID="e93e56a4c51617ee56ce0c1d2eeb747cdbd3d440162bf049fdc3c154f97f7d84" Dec 03 22:41:34 crc kubenswrapper[4830]: I1203 22:41:34.777804 4830 scope.go:117] "RemoveContainer" containerID="749b4e5a8d3edf2b7df274402af3120973aa82be0c84199f9530366d53650cf9" Dec 03 22:41:34 crc kubenswrapper[4830]: I1203 22:41:34.899017 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1293ebb-6f43-4250-b51b-39941694748b" (UID: "b1293ebb-6f43-4250-b51b-39941694748b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:41:34 crc kubenswrapper[4830]: I1203 22:41:34.991661 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1293ebb-6f43-4250-b51b-39941694748b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:35 crc kubenswrapper[4830]: I1203 22:41:35.056607 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9tcts"] Dec 03 22:41:35 crc kubenswrapper[4830]: I1203 22:41:35.066018 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9tcts"] Dec 03 22:41:35 crc kubenswrapper[4830]: I1203 22:41:35.364901 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6" path="/var/lib/kubelet/pods/51d8093a-6e10-47ac-a2e9-20a4d3f5f0c6/volumes" Dec 03 22:41:35 crc kubenswrapper[4830]: I1203 22:41:35.366374 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1293ebb-6f43-4250-b51b-39941694748b" path="/var/lib/kubelet/pods/b1293ebb-6f43-4250-b51b-39941694748b/volumes" 
Dec 03 22:41:38 crc kubenswrapper[4830]: I1203 22:41:38.771416 4830 generic.go:334] "Generic (PLEG): container finished" podID="922398e7-7669-44c9-89e4-cf9cfea422c8" containerID="c98def0cf5fc3379168af0a64c5e4571579f9d83e9c846e69e620ed99497fa21" exitCode=0 Dec 03 22:41:38 crc kubenswrapper[4830]: I1203 22:41:38.771561 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" event={"ID":"922398e7-7669-44c9-89e4-cf9cfea422c8","Type":"ContainerDied","Data":"c98def0cf5fc3379168af0a64c5e4571579f9d83e9c846e69e620ed99497fa21"} Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.034245 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-69q52"] Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.050423 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-69q52"] Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.413340 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.515922 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thb4j\" (UniqueName: \"kubernetes.io/projected/922398e7-7669-44c9-89e4-cf9cfea422c8-kube-api-access-thb4j\") pod \"922398e7-7669-44c9-89e4-cf9cfea422c8\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.516302 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-ssh-key\") pod \"922398e7-7669-44c9-89e4-cf9cfea422c8\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.516450 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-inventory\") pod \"922398e7-7669-44c9-89e4-cf9cfea422c8\" (UID: \"922398e7-7669-44c9-89e4-cf9cfea422c8\") " Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.526701 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/922398e7-7669-44c9-89e4-cf9cfea422c8-kube-api-access-thb4j" (OuterVolumeSpecName: "kube-api-access-thb4j") pod "922398e7-7669-44c9-89e4-cf9cfea422c8" (UID: "922398e7-7669-44c9-89e4-cf9cfea422c8"). InnerVolumeSpecName "kube-api-access-thb4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.546441 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-inventory" (OuterVolumeSpecName: "inventory") pod "922398e7-7669-44c9-89e4-cf9cfea422c8" (UID: "922398e7-7669-44c9-89e4-cf9cfea422c8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.548865 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "922398e7-7669-44c9-89e4-cf9cfea422c8" (UID: "922398e7-7669-44c9-89e4-cf9cfea422c8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.619162 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thb4j\" (UniqueName: \"kubernetes.io/projected/922398e7-7669-44c9-89e4-cf9cfea422c8-kube-api-access-thb4j\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.619197 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.619213 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/922398e7-7669-44c9-89e4-cf9cfea422c8-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.825356 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" event={"ID":"922398e7-7669-44c9-89e4-cf9cfea422c8","Type":"ContainerDied","Data":"f92a230be6e3ffc92c60dd8c71b50fefe3a19731cbe6bd3286b04051d9cde77f"} Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.825661 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f92a230be6e3ffc92c60dd8c71b50fefe3a19731cbe6bd3286b04051d9cde77f" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.825427 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zhhk9" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.874302 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd"] Dec 03 22:41:40 crc kubenswrapper[4830]: E1203 22:41:40.875065 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1293ebb-6f43-4250-b51b-39941694748b" containerName="extract-utilities" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.875085 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1293ebb-6f43-4250-b51b-39941694748b" containerName="extract-utilities" Dec 03 22:41:40 crc kubenswrapper[4830]: E1203 22:41:40.875099 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1293ebb-6f43-4250-b51b-39941694748b" containerName="extract-content" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.875106 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1293ebb-6f43-4250-b51b-39941694748b" containerName="extract-content" Dec 03 22:41:40 crc kubenswrapper[4830]: E1203 22:41:40.875132 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1293ebb-6f43-4250-b51b-39941694748b" containerName="registry-server" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.875138 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1293ebb-6f43-4250-b51b-39941694748b" containerName="registry-server" Dec 03 22:41:40 crc kubenswrapper[4830]: E1203 22:41:40.875153 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="922398e7-7669-44c9-89e4-cf9cfea422c8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.875161 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="922398e7-7669-44c9-89e4-cf9cfea422c8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.875407 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="922398e7-7669-44c9-89e4-cf9cfea422c8" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.875422 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1293ebb-6f43-4250-b51b-39941694748b" containerName="registry-server" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.876585 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.880072 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.884646 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.884685 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd"] Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.884942 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.884988 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.924247 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.924437 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjxt\" (UniqueName: \"kubernetes.io/projected/2153e26c-6b48-4353-9ee3-ac526f0d76b2-kube-api-access-9jjxt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:40 crc kubenswrapper[4830]: I1203 22:41:40.924490 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:41 crc kubenswrapper[4830]: I1203 22:41:41.025965 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjxt\" (UniqueName: \"kubernetes.io/projected/2153e26c-6b48-4353-9ee3-ac526f0d76b2-kube-api-access-9jjxt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:41 crc kubenswrapper[4830]: I1203 22:41:41.026042 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:41 crc kubenswrapper[4830]: I1203 22:41:41.026095 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:41 crc kubenswrapper[4830]: I1203 22:41:41.031714 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:41 crc kubenswrapper[4830]: I1203 22:41:41.032565 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:41 crc kubenswrapper[4830]: I1203 22:41:41.044110 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjxt\" (UniqueName: \"kubernetes.io/projected/2153e26c-6b48-4353-9ee3-ac526f0d76b2-kube-api-access-9jjxt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:41 crc kubenswrapper[4830]: I1203 22:41:41.209577 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:41 crc kubenswrapper[4830]: I1203 22:41:41.356317 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f517ffb5-a85a-4683-b721-ef130df773dc" path="/var/lib/kubelet/pods/f517ffb5-a85a-4683-b721-ef130df773dc/volumes" Dec 03 22:41:41 crc kubenswrapper[4830]: I1203 22:41:41.781616 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd"] Dec 03 22:41:41 crc kubenswrapper[4830]: I1203 22:41:41.834187 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" event={"ID":"2153e26c-6b48-4353-9ee3-ac526f0d76b2","Type":"ContainerStarted","Data":"cb7368f8f86189e0b38a387146042d05fad031cd09d2ee0c156c9b2ea08a8a91"} Dec 03 22:41:42 crc kubenswrapper[4830]: I1203 22:41:42.844241 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" event={"ID":"2153e26c-6b48-4353-9ee3-ac526f0d76b2","Type":"ContainerStarted","Data":"5045eeb9e3a0374ad52fbe0e40da1605ac4cc4398ff2c57db5e00227772f0c1e"} Dec 03 22:41:42 crc kubenswrapper[4830]: I1203 22:41:42.877215 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" podStartSLOduration=2.410452027 podStartE2EDuration="2.877191733s" podCreationTimestamp="2025-12-03 22:41:40 +0000 UTC" firstStartedPulling="2025-12-03 22:41:41.779720806 +0000 UTC m=+2190.776182155" lastFinishedPulling="2025-12-03 22:41:42.246460512 +0000 UTC m=+2191.242921861" observedRunningTime="2025-12-03 22:41:42.860893442 +0000 UTC m=+2191.857354791" watchObservedRunningTime="2025-12-03 22:41:42.877191733 +0000 UTC m=+2191.873653092" Dec 03 22:41:51 crc kubenswrapper[4830]: I1203 22:41:51.928350 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="2153e26c-6b48-4353-9ee3-ac526f0d76b2" containerID="5045eeb9e3a0374ad52fbe0e40da1605ac4cc4398ff2c57db5e00227772f0c1e" exitCode=0 Dec 03 22:41:51 crc kubenswrapper[4830]: I1203 22:41:51.928460 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" event={"ID":"2153e26c-6b48-4353-9ee3-ac526f0d76b2","Type":"ContainerDied","Data":"5045eeb9e3a0374ad52fbe0e40da1605ac4cc4398ff2c57db5e00227772f0c1e"} Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.483808 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.602567 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjxt\" (UniqueName: \"kubernetes.io/projected/2153e26c-6b48-4353-9ee3-ac526f0d76b2-kube-api-access-9jjxt\") pod \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.602666 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-inventory\") pod \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.602870 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-ssh-key\") pod \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\" (UID: \"2153e26c-6b48-4353-9ee3-ac526f0d76b2\") " Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.609361 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2153e26c-6b48-4353-9ee3-ac526f0d76b2-kube-api-access-9jjxt" (OuterVolumeSpecName: 
"kube-api-access-9jjxt") pod "2153e26c-6b48-4353-9ee3-ac526f0d76b2" (UID: "2153e26c-6b48-4353-9ee3-ac526f0d76b2"). InnerVolumeSpecName "kube-api-access-9jjxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.636146 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-inventory" (OuterVolumeSpecName: "inventory") pod "2153e26c-6b48-4353-9ee3-ac526f0d76b2" (UID: "2153e26c-6b48-4353-9ee3-ac526f0d76b2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.642033 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2153e26c-6b48-4353-9ee3-ac526f0d76b2" (UID: "2153e26c-6b48-4353-9ee3-ac526f0d76b2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.706930 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjxt\" (UniqueName: \"kubernetes.io/projected/2153e26c-6b48-4353-9ee3-ac526f0d76b2-kube-api-access-9jjxt\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.706973 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.706985 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2153e26c-6b48-4353-9ee3-ac526f0d76b2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.955974 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" event={"ID":"2153e26c-6b48-4353-9ee3-ac526f0d76b2","Type":"ContainerDied","Data":"cb7368f8f86189e0b38a387146042d05fad031cd09d2ee0c156c9b2ea08a8a91"} Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.956013 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb7368f8f86189e0b38a387146042d05fad031cd09d2ee0c156c9b2ea08a8a91" Dec 03 22:41:53 crc kubenswrapper[4830]: I1203 22:41:53.956058 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.067327 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs"] Dec 03 22:41:54 crc kubenswrapper[4830]: E1203 22:41:54.067743 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153e26c-6b48-4353-9ee3-ac526f0d76b2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.067762 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153e26c-6b48-4353-9ee3-ac526f0d76b2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.067967 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153e26c-6b48-4353-9ee3-ac526f0d76b2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.068767 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.073114 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.073340 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.073580 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.073735 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.073886 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.074040 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.074232 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.074401 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.083875 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs"] Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217341 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217404 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217462 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217502 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217542 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217614 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217640 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217663 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217710 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217794 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5bxc\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-kube-api-access-s5bxc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217857 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217879 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217909 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.217927 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.319496 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.319866 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.319904 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: 
\"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.319930 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.319978 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.320015 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.320066 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc 
kubenswrapper[4830]: I1203 22:41:54.320137 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.320167 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.320238 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.320264 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.320288 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.320338 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.320402 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5bxc\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-kube-api-access-s5bxc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.324589 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.327186 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.328810 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.329168 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.329912 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.330337 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.330427 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.331010 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.331162 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.331630 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.333878 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.337029 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.337886 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.338750 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5bxc\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-kube-api-access-s5bxc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.388760 4830 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.949167 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs"] Dec 03 22:41:54 crc kubenswrapper[4830]: I1203 22:41:54.967499 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" event={"ID":"1c532c81-40fc-4058-bb22-abec161c538a","Type":"ContainerStarted","Data":"5685743d236c8a306599df77cf36157b971df6843bfd97d3885babc58ba15733"} Dec 03 22:41:55 crc kubenswrapper[4830]: I1203 22:41:55.999115 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" event={"ID":"1c532c81-40fc-4058-bb22-abec161c538a","Type":"ContainerStarted","Data":"256a8033e1269238a9bf56a64e6c494ef6206b527bebb6fe6102eeaf1ab6c33b"} Dec 03 22:41:56 crc kubenswrapper[4830]: I1203 22:41:56.026261 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" podStartSLOduration=1.458046661 podStartE2EDuration="2.02624462s" podCreationTimestamp="2025-12-03 22:41:54 +0000 UTC" firstStartedPulling="2025-12-03 22:41:54.95190752 +0000 UTC m=+2203.948368869" lastFinishedPulling="2025-12-03 22:41:55.520105479 +0000 UTC m=+2204.516566828" observedRunningTime="2025-12-03 22:41:56.024940325 +0000 UTC m=+2205.021401674" watchObservedRunningTime="2025-12-03 22:41:56.02624462 +0000 UTC m=+2205.022705969" Dec 03 22:42:23 crc kubenswrapper[4830]: I1203 22:42:23.878998 4830 scope.go:117] "RemoveContainer" containerID="f5f2d638b620b0ad73387d7243395a60174a6ea21f7a5148da06db4ace444741" Dec 03 22:42:23 crc kubenswrapper[4830]: I1203 22:42:23.935670 4830 scope.go:117] "RemoveContainer" containerID="67bad5d47240553a792fe38cbf37fa74ae64a93185be9de7e9fb4c044abfb63e" Dec 03 
22:42:33 crc kubenswrapper[4830]: I1203 22:42:33.802934 4830 generic.go:334] "Generic (PLEG): container finished" podID="1c532c81-40fc-4058-bb22-abec161c538a" containerID="256a8033e1269238a9bf56a64e6c494ef6206b527bebb6fe6102eeaf1ab6c33b" exitCode=0 Dec 03 22:42:33 crc kubenswrapper[4830]: I1203 22:42:33.803032 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" event={"ID":"1c532c81-40fc-4058-bb22-abec161c538a","Type":"ContainerDied","Data":"256a8033e1269238a9bf56a64e6c494ef6206b527bebb6fe6102eeaf1ab6c33b"} Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.348996 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354144 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ovn-combined-ca-bundle\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354231 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-libvirt-combined-ca-bundle\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354257 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-telemetry-combined-ca-bundle\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354289 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354354 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-repo-setup-combined-ca-bundle\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354419 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-neutron-metadata-combined-ca-bundle\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354504 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-bootstrap-combined-ca-bundle\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354628 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5bxc\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-kube-api-access-s5bxc\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354657 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354680 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354775 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-nova-combined-ca-bundle\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354840 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-inventory\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354898 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ssh-key\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.354936 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"1c532c81-40fc-4058-bb22-abec161c538a\" (UID: \"1c532c81-40fc-4058-bb22-abec161c538a\") " Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.363102 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.363153 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.363681 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.364168 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.365002 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.365480 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.366140 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.367708 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.368800 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.375083 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.376052 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.376777 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-kube-api-access-s5bxc" (OuterVolumeSpecName: "kube-api-access-s5bxc") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "kube-api-access-s5bxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.414838 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.416409 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-inventory" (OuterVolumeSpecName: "inventory") pod "1c532c81-40fc-4058-bb22-abec161c538a" (UID: "1c532c81-40fc-4058-bb22-abec161c538a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457185 4830 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457230 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457242 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457255 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457269 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457281 4830 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457297 4830 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-telemetry-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457309 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457323 4830 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457338 4830 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457350 4830 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c532c81-40fc-4058-bb22-abec161c538a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457362 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5bxc\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-kube-api-access-s5bxc\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457374 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.457386 4830 reconciler_common.go:293] "Volume detached for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1c532c81-40fc-4058-bb22-abec161c538a-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.822665 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" event={"ID":"1c532c81-40fc-4058-bb22-abec161c538a","Type":"ContainerDied","Data":"5685743d236c8a306599df77cf36157b971df6843bfd97d3885babc58ba15733"} Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.822710 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.822716 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5685743d236c8a306599df77cf36157b971df6843bfd97d3885babc58ba15733" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.966771 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq"] Dec 03 22:42:35 crc kubenswrapper[4830]: E1203 22:42:35.967602 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c532c81-40fc-4058-bb22-abec161c538a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.967625 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c532c81-40fc-4058-bb22-abec161c538a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.967867 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c532c81-40fc-4058-bb22-abec161c538a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.968818 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.970932 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.971490 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.971622 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.974947 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 03 22:42:35 crc kubenswrapper[4830]: I1203 22:42:35.975205 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.007503 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq"] Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.066909 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.067201 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5qft\" (UniqueName: \"kubernetes.io/projected/f036d299-7239-400c-b3c4-f20ec8ed1f26-kube-api-access-p5qft\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: 
\"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.067327 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.067434 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.067595 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.170357 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.170445 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p5qft\" (UniqueName: \"kubernetes.io/projected/f036d299-7239-400c-b3c4-f20ec8ed1f26-kube-api-access-p5qft\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.170486 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.170557 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.170628 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.172432 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc 
kubenswrapper[4830]: I1203 22:42:36.176489 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.177369 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.181281 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.190455 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5qft\" (UniqueName: \"kubernetes.io/projected/f036d299-7239-400c-b3c4-f20ec8ed1f26-kube-api-access-p5qft\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2wbdq\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.287176 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:42:36 crc kubenswrapper[4830]: I1203 22:42:36.868421 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq"] Dec 03 22:42:37 crc kubenswrapper[4830]: I1203 22:42:37.842863 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" event={"ID":"f036d299-7239-400c-b3c4-f20ec8ed1f26","Type":"ContainerStarted","Data":"facbe3d12f5bb07749dc69c9ed97a787b5d916b8a28e047e58e74cafa4ce44ab"} Dec 03 22:42:37 crc kubenswrapper[4830]: I1203 22:42:37.843425 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" event={"ID":"f036d299-7239-400c-b3c4-f20ec8ed1f26","Type":"ContainerStarted","Data":"0a7dae9363a31579c72927ac40feb8e48773ef71584044117cad19da58bc0c05"} Dec 03 22:42:37 crc kubenswrapper[4830]: I1203 22:42:37.867557 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" podStartSLOduration=2.470609205 podStartE2EDuration="2.867531501s" podCreationTimestamp="2025-12-03 22:42:35 +0000 UTC" firstStartedPulling="2025-12-03 22:42:36.869759912 +0000 UTC m=+2245.866221261" lastFinishedPulling="2025-12-03 22:42:37.266682198 +0000 UTC m=+2246.263143557" observedRunningTime="2025-12-03 22:42:37.855810824 +0000 UTC m=+2246.852272213" watchObservedRunningTime="2025-12-03 22:42:37.867531501 +0000 UTC m=+2246.863992860" Dec 03 22:43:38 crc kubenswrapper[4830]: I1203 22:43:38.872311 4830 generic.go:334] "Generic (PLEG): container finished" podID="f036d299-7239-400c-b3c4-f20ec8ed1f26" containerID="facbe3d12f5bb07749dc69c9ed97a787b5d916b8a28e047e58e74cafa4ce44ab" exitCode=0 Dec 03 22:43:38 crc kubenswrapper[4830]: I1203 22:43:38.872402 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" event={"ID":"f036d299-7239-400c-b3c4-f20ec8ed1f26","Type":"ContainerDied","Data":"facbe3d12f5bb07749dc69c9ed97a787b5d916b8a28e047e58e74cafa4ce44ab"} Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.307976 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.434170 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovn-combined-ca-bundle\") pod \"f036d299-7239-400c-b3c4-f20ec8ed1f26\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.434284 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5qft\" (UniqueName: \"kubernetes.io/projected/f036d299-7239-400c-b3c4-f20ec8ed1f26-kube-api-access-p5qft\") pod \"f036d299-7239-400c-b3c4-f20ec8ed1f26\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.434328 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ssh-key\") pod \"f036d299-7239-400c-b3c4-f20ec8ed1f26\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.434395 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovncontroller-config-0\") pod \"f036d299-7239-400c-b3c4-f20ec8ed1f26\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.434435 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-inventory\") pod \"f036d299-7239-400c-b3c4-f20ec8ed1f26\" (UID: \"f036d299-7239-400c-b3c4-f20ec8ed1f26\") " Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.442696 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f036d299-7239-400c-b3c4-f20ec8ed1f26-kube-api-access-p5qft" (OuterVolumeSpecName: "kube-api-access-p5qft") pod "f036d299-7239-400c-b3c4-f20ec8ed1f26" (UID: "f036d299-7239-400c-b3c4-f20ec8ed1f26"). InnerVolumeSpecName "kube-api-access-p5qft". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.452764 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f036d299-7239-400c-b3c4-f20ec8ed1f26" (UID: "f036d299-7239-400c-b3c4-f20ec8ed1f26"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.473245 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f036d299-7239-400c-b3c4-f20ec8ed1f26" (UID: "f036d299-7239-400c-b3c4-f20ec8ed1f26"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.493038 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "f036d299-7239-400c-b3c4-f20ec8ed1f26" (UID: "f036d299-7239-400c-b3c4-f20ec8ed1f26"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.524663 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-inventory" (OuterVolumeSpecName: "inventory") pod "f036d299-7239-400c-b3c4-f20ec8ed1f26" (UID: "f036d299-7239-400c-b3c4-f20ec8ed1f26"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.537005 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.537031 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.537041 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5qft\" (UniqueName: \"kubernetes.io/projected/f036d299-7239-400c-b3c4-f20ec8ed1f26-kube-api-access-p5qft\") on node \"crc\" DevicePath \"\"" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.537049 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f036d299-7239-400c-b3c4-f20ec8ed1f26-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.537058 4830 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f036d299-7239-400c-b3c4-f20ec8ed1f26-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.891322 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" event={"ID":"f036d299-7239-400c-b3c4-f20ec8ed1f26","Type":"ContainerDied","Data":"0a7dae9363a31579c72927ac40feb8e48773ef71584044117cad19da58bc0c05"} Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.891681 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a7dae9363a31579c72927ac40feb8e48773ef71584044117cad19da58bc0c05" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.891445 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2wbdq" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.989089 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz"] Dec 03 22:43:40 crc kubenswrapper[4830]: E1203 22:43:40.989758 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f036d299-7239-400c-b3c4-f20ec8ed1f26" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.989789 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f036d299-7239-400c-b3c4-f20ec8ed1f26" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.990169 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f036d299-7239-400c-b3c4-f20ec8ed1f26" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.991392 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.994199 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.994448 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.996043 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.996153 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.996058 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:43:40 crc kubenswrapper[4830]: I1203 22:43:40.996433 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.015887 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz"] Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.045951 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.045999 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xpv87\" (UniqueName: \"kubernetes.io/projected/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-kube-api-access-xpv87\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.046043 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.046225 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.046408 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.046439 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.148389 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.148464 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.148611 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.148651 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.148743 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.148775 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpv87\" (UniqueName: \"kubernetes.io/projected/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-kube-api-access-xpv87\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.152986 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.157153 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.157288 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.157444 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.157481 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.170234 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpv87\" (UniqueName: \"kubernetes.io/projected/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-kube-api-access-xpv87\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc 
kubenswrapper[4830]: I1203 22:43:41.310875 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.852485 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz"] Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.855218 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:43:41 crc kubenswrapper[4830]: I1203 22:43:41.901103 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" event={"ID":"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf","Type":"ContainerStarted","Data":"f7afe94dc43f3385cc52e3f941ae049fe0702fc6ea57dd6ca4641ea5ba73a037"} Dec 03 22:43:42 crc kubenswrapper[4830]: I1203 22:43:42.911634 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" event={"ID":"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf","Type":"ContainerStarted","Data":"caa74147ba18e8a816d8c019ab127cbde156002a457c20aef8abbade004de9a1"} Dec 03 22:43:42 crc kubenswrapper[4830]: I1203 22:43:42.931942 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" podStartSLOduration=2.441493744 podStartE2EDuration="2.9319216s" podCreationTimestamp="2025-12-03 22:43:40 +0000 UTC" firstStartedPulling="2025-12-03 22:43:41.854984838 +0000 UTC m=+2310.851446177" lastFinishedPulling="2025-12-03 22:43:42.345412684 +0000 UTC m=+2311.341874033" observedRunningTime="2025-12-03 22:43:42.925531346 +0000 UTC m=+2311.921992695" watchObservedRunningTime="2025-12-03 22:43:42.9319216 +0000 UTC m=+2311.928382949" Dec 03 22:43:56 crc kubenswrapper[4830]: I1203 22:43:56.681091 4830 patch_prober.go:28] interesting 
pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:43:56 crc kubenswrapper[4830]: I1203 22:43:56.681838 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:44:26 crc kubenswrapper[4830]: I1203 22:44:26.681665 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:44:26 crc kubenswrapper[4830]: I1203 22:44:26.683314 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:44:31 crc kubenswrapper[4830]: I1203 22:44:31.391164 4830 generic.go:334] "Generic (PLEG): container finished" podID="f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf" containerID="caa74147ba18e8a816d8c019ab127cbde156002a457c20aef8abbade004de9a1" exitCode=0 Dec 03 22:44:31 crc kubenswrapper[4830]: I1203 22:44:31.391192 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" event={"ID":"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf","Type":"ContainerDied","Data":"caa74147ba18e8a816d8c019ab127cbde156002a457c20aef8abbade004de9a1"} Dec 03 
22:44:32 crc kubenswrapper[4830]: I1203 22:44:32.946836 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.082594 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-metadata-combined-ca-bundle\") pod \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.082666 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-nova-metadata-neutron-config-0\") pod \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.082702 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.082815 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-inventory\") pod \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.082971 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpv87\" (UniqueName: 
\"kubernetes.io/projected/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-kube-api-access-xpv87\") pod \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.083036 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-ssh-key\") pod \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\" (UID: \"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf\") " Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.088420 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf" (UID: "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.089241 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-kube-api-access-xpv87" (OuterVolumeSpecName: "kube-api-access-xpv87") pod "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf" (UID: "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf"). InnerVolumeSpecName "kube-api-access-xpv87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.117124 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-inventory" (OuterVolumeSpecName: "inventory") pod "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf" (UID: "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.118141 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf" (UID: "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.119643 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf" (UID: "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.124318 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf" (UID: "f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.186469 4830 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.186536 4830 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.186554 4830 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.186568 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.186582 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpv87\" (UniqueName: \"kubernetes.io/projected/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-kube-api-access-xpv87\") on node \"crc\" DevicePath \"\"" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.186593 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.414537 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" 
event={"ID":"f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf","Type":"ContainerDied","Data":"f7afe94dc43f3385cc52e3f941ae049fe0702fc6ea57dd6ca4641ea5ba73a037"} Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.414579 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7afe94dc43f3385cc52e3f941ae049fe0702fc6ea57dd6ca4641ea5ba73a037" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.414623 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.508802 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7"] Dec 03 22:44:33 crc kubenswrapper[4830]: E1203 22:44:33.509249 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.509262 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.509666 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.513244 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.519061 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.519228 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.519299 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.519335 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.519382 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.522970 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7"] Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.594501 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.594588 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.594818 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbh8r\" (UniqueName: \"kubernetes.io/projected/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-kube-api-access-xbh8r\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.595055 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.595084 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.697174 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.697248 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.697322 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbh8r\" (UniqueName: \"kubernetes.io/projected/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-kube-api-access-xbh8r\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.697407 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.697428 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.702638 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.702850 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.702964 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.704370 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.715387 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbh8r\" (UniqueName: \"kubernetes.io/projected/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-kube-api-access-xbh8r\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:33 crc kubenswrapper[4830]: I1203 22:44:33.849176 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:44:34 crc kubenswrapper[4830]: I1203 22:44:34.372178 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7"] Dec 03 22:44:34 crc kubenswrapper[4830]: I1203 22:44:34.424577 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" event={"ID":"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51","Type":"ContainerStarted","Data":"0bcc00ff22a7ca1e8640cf16a83416f0cc08d25c79258e230d35a5c115d0086c"} Dec 03 22:44:37 crc kubenswrapper[4830]: I1203 22:44:37.458054 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" event={"ID":"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51","Type":"ContainerStarted","Data":"8b6ac0c8093e2b8b6d57491b53e08a44f5fcf0ada3bb4aea37548b51c73e55ea"} Dec 03 22:44:37 crc kubenswrapper[4830]: I1203 22:44:37.481769 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" podStartSLOduration=2.632910566 podStartE2EDuration="4.481754229s" podCreationTimestamp="2025-12-03 22:44:33 +0000 UTC" firstStartedPulling="2025-12-03 22:44:34.377850736 +0000 UTC m=+2363.374312085" lastFinishedPulling="2025-12-03 22:44:36.226694389 +0000 UTC m=+2365.223155748" observedRunningTime="2025-12-03 22:44:37.470795463 +0000 UTC m=+2366.467256802" watchObservedRunningTime="2025-12-03 22:44:37.481754229 +0000 UTC m=+2366.478215578" Dec 03 22:44:56 crc kubenswrapper[4830]: I1203 22:44:56.681695 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:44:56 crc kubenswrapper[4830]: I1203 
22:44:56.682425 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:44:56 crc kubenswrapper[4830]: I1203 22:44:56.682489 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:44:56 crc kubenswrapper[4830]: I1203 22:44:56.683306 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 22:44:56 crc kubenswrapper[4830]: I1203 22:44:56.683370 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" gracePeriod=600 Dec 03 22:44:56 crc kubenswrapper[4830]: E1203 22:44:56.801947 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:44:57 crc kubenswrapper[4830]: I1203 22:44:57.659430 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" exitCode=0 Dec 03 22:44:57 crc kubenswrapper[4830]: I1203 22:44:57.659545 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457"} Dec 03 22:44:57 crc kubenswrapper[4830]: I1203 22:44:57.659802 4830 scope.go:117] "RemoveContainer" containerID="d10b423867c2686e7447bb9deaddcfc31d35a58e073292e40d8a363528880e7e" Dec 03 22:44:57 crc kubenswrapper[4830]: I1203 22:44:57.660725 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:44:57 crc kubenswrapper[4830]: E1203 22:44:57.661157 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.141787 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j"] Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.143488 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.145630 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.145630 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.155997 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j"] Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.192922 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpgj6\" (UniqueName: \"kubernetes.io/projected/bd4dac2c-1aff-4e52-9f49-3ef531261ded-kube-api-access-mpgj6\") pod \"collect-profiles-29413365-h7s4j\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.193331 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd4dac2c-1aff-4e52-9f49-3ef531261ded-config-volume\") pod \"collect-profiles-29413365-h7s4j\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.193451 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd4dac2c-1aff-4e52-9f49-3ef531261ded-secret-volume\") pod \"collect-profiles-29413365-h7s4j\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.294953 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpgj6\" (UniqueName: \"kubernetes.io/projected/bd4dac2c-1aff-4e52-9f49-3ef531261ded-kube-api-access-mpgj6\") pod \"collect-profiles-29413365-h7s4j\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.295078 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd4dac2c-1aff-4e52-9f49-3ef531261ded-config-volume\") pod \"collect-profiles-29413365-h7s4j\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.295094 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd4dac2c-1aff-4e52-9f49-3ef531261ded-secret-volume\") pod \"collect-profiles-29413365-h7s4j\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.295975 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd4dac2c-1aff-4e52-9f49-3ef531261ded-config-volume\") pod \"collect-profiles-29413365-h7s4j\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.305256 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/bd4dac2c-1aff-4e52-9f49-3ef531261ded-secret-volume\") pod \"collect-profiles-29413365-h7s4j\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.323556 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpgj6\" (UniqueName: \"kubernetes.io/projected/bd4dac2c-1aff-4e52-9f49-3ef531261ded-kube-api-access-mpgj6\") pod \"collect-profiles-29413365-h7s4j\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.512613 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:00 crc kubenswrapper[4830]: I1203 22:45:00.971300 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j"] Dec 03 22:45:00 crc kubenswrapper[4830]: W1203 22:45:00.978465 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd4dac2c_1aff_4e52_9f49_3ef531261ded.slice/crio-08d85edb716db2e9c159e29247494db031dbc2a7a132ffeafa541a9a08f4a178 WatchSource:0}: Error finding container 08d85edb716db2e9c159e29247494db031dbc2a7a132ffeafa541a9a08f4a178: Status 404 returned error can't find the container with id 08d85edb716db2e9c159e29247494db031dbc2a7a132ffeafa541a9a08f4a178 Dec 03 22:45:01 crc kubenswrapper[4830]: I1203 22:45:01.715282 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" event={"ID":"bd4dac2c-1aff-4e52-9f49-3ef531261ded","Type":"ContainerStarted","Data":"08d85edb716db2e9c159e29247494db031dbc2a7a132ffeafa541a9a08f4a178"} Dec 03 22:45:02 crc 
kubenswrapper[4830]: I1203 22:45:02.742876 4830 generic.go:334] "Generic (PLEG): container finished" podID="bd4dac2c-1aff-4e52-9f49-3ef531261ded" containerID="909ebb22507b12a9ba49af59ed00401cab47b0c0ad855ff56e331b287cede255" exitCode=0 Dec 03 22:45:02 crc kubenswrapper[4830]: I1203 22:45:02.743218 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" event={"ID":"bd4dac2c-1aff-4e52-9f49-3ef531261ded","Type":"ContainerDied","Data":"909ebb22507b12a9ba49af59ed00401cab47b0c0ad855ff56e331b287cede255"} Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.208754 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.285713 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd4dac2c-1aff-4e52-9f49-3ef531261ded-config-volume\") pod \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.285791 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd4dac2c-1aff-4e52-9f49-3ef531261ded-secret-volume\") pod \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.285818 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpgj6\" (UniqueName: \"kubernetes.io/projected/bd4dac2c-1aff-4e52-9f49-3ef531261ded-kube-api-access-mpgj6\") pod \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\" (UID: \"bd4dac2c-1aff-4e52-9f49-3ef531261ded\") " Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.286466 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/bd4dac2c-1aff-4e52-9f49-3ef531261ded-config-volume" (OuterVolumeSpecName: "config-volume") pod "bd4dac2c-1aff-4e52-9f49-3ef531261ded" (UID: "bd4dac2c-1aff-4e52-9f49-3ef531261ded"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.286609 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd4dac2c-1aff-4e52-9f49-3ef531261ded-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.292737 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd4dac2c-1aff-4e52-9f49-3ef531261ded-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bd4dac2c-1aff-4e52-9f49-3ef531261ded" (UID: "bd4dac2c-1aff-4e52-9f49-3ef531261ded"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.292772 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4dac2c-1aff-4e52-9f49-3ef531261ded-kube-api-access-mpgj6" (OuterVolumeSpecName: "kube-api-access-mpgj6") pod "bd4dac2c-1aff-4e52-9f49-3ef531261ded" (UID: "bd4dac2c-1aff-4e52-9f49-3ef531261ded"). InnerVolumeSpecName "kube-api-access-mpgj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.388342 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd4dac2c-1aff-4e52-9f49-3ef531261ded-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.388378 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpgj6\" (UniqueName: \"kubernetes.io/projected/bd4dac2c-1aff-4e52-9f49-3ef531261ded-kube-api-access-mpgj6\") on node \"crc\" DevicePath \"\"" Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.765440 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" event={"ID":"bd4dac2c-1aff-4e52-9f49-3ef531261ded","Type":"ContainerDied","Data":"08d85edb716db2e9c159e29247494db031dbc2a7a132ffeafa541a9a08f4a178"} Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.765805 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08d85edb716db2e9c159e29247494db031dbc2a7a132ffeafa541a9a08f4a178" Dec 03 22:45:04 crc kubenswrapper[4830]: I1203 22:45:04.765495 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-h7s4j" Dec 03 22:45:05 crc kubenswrapper[4830]: I1203 22:45:05.282923 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr"] Dec 03 22:45:05 crc kubenswrapper[4830]: I1203 22:45:05.294661 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413320-f4qrr"] Dec 03 22:45:05 crc kubenswrapper[4830]: I1203 22:45:05.351713 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3102808a-d149-4c79-bda3-29ce37d9b96b" path="/var/lib/kubelet/pods/3102808a-d149-4c79-bda3-29ce37d9b96b/volumes" Dec 03 22:45:10 crc kubenswrapper[4830]: I1203 22:45:10.338668 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:45:10 crc kubenswrapper[4830]: E1203 22:45:10.339902 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.271784 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6sltp"] Dec 03 22:45:23 crc kubenswrapper[4830]: E1203 22:45:23.272951 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4dac2c-1aff-4e52-9f49-3ef531261ded" containerName="collect-profiles" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.272970 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4dac2c-1aff-4e52-9f49-3ef531261ded" containerName="collect-profiles" Dec 03 22:45:23 crc 
kubenswrapper[4830]: I1203 22:45:23.273219 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4dac2c-1aff-4e52-9f49-3ef531261ded" containerName="collect-profiles" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.275204 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.288710 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sltp"] Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.376865 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-catalog-content\") pod \"redhat-marketplace-6sltp\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.377161 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-utilities\") pod \"redhat-marketplace-6sltp\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.377239 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz8sf\" (UniqueName: \"kubernetes.io/projected/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-kube-api-access-tz8sf\") pod \"redhat-marketplace-6sltp\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.479054 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-catalog-content\") pod \"redhat-marketplace-6sltp\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.479561 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-utilities\") pod \"redhat-marketplace-6sltp\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.479611 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz8sf\" (UniqueName: \"kubernetes.io/projected/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-kube-api-access-tz8sf\") pod \"redhat-marketplace-6sltp\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.479608 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-catalog-content\") pod \"redhat-marketplace-6sltp\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.480019 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-utilities\") pod \"redhat-marketplace-6sltp\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.509468 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz8sf\" (UniqueName: 
\"kubernetes.io/projected/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-kube-api-access-tz8sf\") pod \"redhat-marketplace-6sltp\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:23 crc kubenswrapper[4830]: I1203 22:45:23.600689 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:24 crc kubenswrapper[4830]: I1203 22:45:24.083107 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sltp"] Dec 03 22:45:24 crc kubenswrapper[4830]: I1203 22:45:24.117001 4830 scope.go:117] "RemoveContainer" containerID="e487489c506620faa80fb760895bc593939a98b7f17a67b39905a709677e219b" Dec 03 22:45:25 crc kubenswrapper[4830]: I1203 22:45:25.106627 4830 generic.go:334] "Generic (PLEG): container finished" podID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" containerID="3103c60c4c29e782c130d1985bd3aa8a983e9d9685af4fe7476eff80558ffa23" exitCode=0 Dec 03 22:45:25 crc kubenswrapper[4830]: I1203 22:45:25.106690 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sltp" event={"ID":"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf","Type":"ContainerDied","Data":"3103c60c4c29e782c130d1985bd3aa8a983e9d9685af4fe7476eff80558ffa23"} Dec 03 22:45:25 crc kubenswrapper[4830]: I1203 22:45:25.106920 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sltp" event={"ID":"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf","Type":"ContainerStarted","Data":"6dddb95e60b1ae4c25a670de1c6440e3112623e6ca3a5147699a720ed2a9eda0"} Dec 03 22:45:25 crc kubenswrapper[4830]: I1203 22:45:25.336869 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:45:25 crc kubenswrapper[4830]: E1203 22:45:25.337379 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:45:26 crc kubenswrapper[4830]: I1203 22:45:26.122206 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sltp" event={"ID":"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf","Type":"ContainerStarted","Data":"f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3"} Dec 03 22:45:27 crc kubenswrapper[4830]: I1203 22:45:27.133343 4830 generic.go:334] "Generic (PLEG): container finished" podID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" containerID="f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3" exitCode=0 Dec 03 22:45:27 crc kubenswrapper[4830]: I1203 22:45:27.133552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sltp" event={"ID":"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf","Type":"ContainerDied","Data":"f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3"} Dec 03 22:45:28 crc kubenswrapper[4830]: I1203 22:45:28.146738 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sltp" event={"ID":"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf","Type":"ContainerStarted","Data":"a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475"} Dec 03 22:45:28 crc kubenswrapper[4830]: I1203 22:45:28.176190 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6sltp" podStartSLOduration=2.7454287 podStartE2EDuration="5.176173243s" podCreationTimestamp="2025-12-03 22:45:23 +0000 UTC" firstStartedPulling="2025-12-03 22:45:25.110066513 +0000 UTC m=+2414.106527852" lastFinishedPulling="2025-12-03 22:45:27.540811036 
+0000 UTC m=+2416.537272395" observedRunningTime="2025-12-03 22:45:28.16645787 +0000 UTC m=+2417.162919219" watchObservedRunningTime="2025-12-03 22:45:28.176173243 +0000 UTC m=+2417.172634592" Dec 03 22:45:33 crc kubenswrapper[4830]: I1203 22:45:33.601806 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:33 crc kubenswrapper[4830]: I1203 22:45:33.602447 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:33 crc kubenswrapper[4830]: I1203 22:45:33.653789 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:34 crc kubenswrapper[4830]: I1203 22:45:34.269887 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:37 crc kubenswrapper[4830]: I1203 22:45:37.259223 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sltp"] Dec 03 22:45:37 crc kubenswrapper[4830]: I1203 22:45:37.261267 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6sltp" podUID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" containerName="registry-server" containerID="cri-o://a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475" gracePeriod=2 Dec 03 22:45:37 crc kubenswrapper[4830]: I1203 22:45:37.337109 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:45:37 crc kubenswrapper[4830]: E1203 22:45:37.337607 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:45:37 crc kubenswrapper[4830]: I1203 22:45:37.792373 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:37 crc kubenswrapper[4830]: I1203 22:45:37.896180 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz8sf\" (UniqueName: \"kubernetes.io/projected/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-kube-api-access-tz8sf\") pod \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " Dec 03 22:45:37 crc kubenswrapper[4830]: I1203 22:45:37.896661 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-utilities\") pod \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " Dec 03 22:45:37 crc kubenswrapper[4830]: I1203 22:45:37.896903 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-catalog-content\") pod \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\" (UID: \"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf\") " Dec 03 22:45:37 crc kubenswrapper[4830]: I1203 22:45:37.898663 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-utilities" (OuterVolumeSpecName: "utilities") pod "8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" (UID: "8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:45:37 crc kubenswrapper[4830]: I1203 22:45:37.920254 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-kube-api-access-tz8sf" (OuterVolumeSpecName: "kube-api-access-tz8sf") pod "8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" (UID: "8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf"). InnerVolumeSpecName "kube-api-access-tz8sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:45:37 crc kubenswrapper[4830]: I1203 22:45:37.929923 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" (UID: "8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.001005 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.001044 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.001057 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz8sf\" (UniqueName: \"kubernetes.io/projected/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf-kube-api-access-tz8sf\") on node \"crc\" DevicePath \"\"" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.256284 4830 generic.go:334] "Generic (PLEG): container finished" podID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" 
containerID="a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475" exitCode=0 Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.256337 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sltp" event={"ID":"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf","Type":"ContainerDied","Data":"a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475"} Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.256387 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sltp" event={"ID":"8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf","Type":"ContainerDied","Data":"6dddb95e60b1ae4c25a670de1c6440e3112623e6ca3a5147699a720ed2a9eda0"} Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.256409 4830 scope.go:117] "RemoveContainer" containerID="a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.256453 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sltp" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.283287 4830 scope.go:117] "RemoveContainer" containerID="f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.313814 4830 scope.go:117] "RemoveContainer" containerID="3103c60c4c29e782c130d1985bd3aa8a983e9d9685af4fe7476eff80558ffa23" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.327576 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sltp"] Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.339257 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sltp"] Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.365885 4830 scope.go:117] "RemoveContainer" containerID="a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475" Dec 03 22:45:38 crc kubenswrapper[4830]: E1203 22:45:38.366603 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475\": container with ID starting with a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475 not found: ID does not exist" containerID="a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.366635 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475"} err="failed to get container status \"a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475\": rpc error: code = NotFound desc = could not find container \"a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475\": container with ID starting with a0789b673d7b512dc6af11374a86e06490cf11dedd937ec2b6b0743c48d4a475 not found: 
ID does not exist" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.366674 4830 scope.go:117] "RemoveContainer" containerID="f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3" Dec 03 22:45:38 crc kubenswrapper[4830]: E1203 22:45:38.366922 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3\": container with ID starting with f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3 not found: ID does not exist" containerID="f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.366943 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3"} err="failed to get container status \"f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3\": rpc error: code = NotFound desc = could not find container \"f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3\": container with ID starting with f8f9636707c2678f558461be4ff6dcb0a2ca43d2fabaa27ae5d178e84a1592e3 not found: ID does not exist" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.366958 4830 scope.go:117] "RemoveContainer" containerID="3103c60c4c29e782c130d1985bd3aa8a983e9d9685af4fe7476eff80558ffa23" Dec 03 22:45:38 crc kubenswrapper[4830]: E1203 22:45:38.367488 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3103c60c4c29e782c130d1985bd3aa8a983e9d9685af4fe7476eff80558ffa23\": container with ID starting with 3103c60c4c29e782c130d1985bd3aa8a983e9d9685af4fe7476eff80558ffa23 not found: ID does not exist" containerID="3103c60c4c29e782c130d1985bd3aa8a983e9d9685af4fe7476eff80558ffa23" Dec 03 22:45:38 crc kubenswrapper[4830]: I1203 22:45:38.367538 4830 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3103c60c4c29e782c130d1985bd3aa8a983e9d9685af4fe7476eff80558ffa23"} err="failed to get container status \"3103c60c4c29e782c130d1985bd3aa8a983e9d9685af4fe7476eff80558ffa23\": rpc error: code = NotFound desc = could not find container \"3103c60c4c29e782c130d1985bd3aa8a983e9d9685af4fe7476eff80558ffa23\": container with ID starting with 3103c60c4c29e782c130d1985bd3aa8a983e9d9685af4fe7476eff80558ffa23 not found: ID does not exist" Dec 03 22:45:39 crc kubenswrapper[4830]: I1203 22:45:39.348860 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" path="/var/lib/kubelet/pods/8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf/volumes" Dec 03 22:45:48 crc kubenswrapper[4830]: I1203 22:45:48.336865 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:45:48 crc kubenswrapper[4830]: E1203 22:45:48.338138 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:46:02 crc kubenswrapper[4830]: I1203 22:46:02.337769 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:46:02 crc kubenswrapper[4830]: E1203 22:46:02.338793 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:46:17 crc kubenswrapper[4830]: I1203 22:46:17.337358 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:46:17 crc kubenswrapper[4830]: E1203 22:46:17.338114 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:46:30 crc kubenswrapper[4830]: I1203 22:46:30.337656 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:46:30 crc kubenswrapper[4830]: E1203 22:46:30.338612 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:46:41 crc kubenswrapper[4830]: I1203 22:46:41.343669 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:46:41 crc kubenswrapper[4830]: E1203 22:46:41.344586 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:46:56 crc kubenswrapper[4830]: I1203 22:46:56.337638 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:46:56 crc kubenswrapper[4830]: E1203 22:46:56.341415 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:47:10 crc kubenswrapper[4830]: I1203 22:47:10.337080 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:47:10 crc kubenswrapper[4830]: E1203 22:47:10.337930 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:47:23 crc kubenswrapper[4830]: I1203 22:47:23.337500 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:47:23 crc kubenswrapper[4830]: E1203 22:47:23.338908 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:47:35 crc kubenswrapper[4830]: I1203 22:47:35.337928 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:47:35 crc kubenswrapper[4830]: E1203 22:47:35.338869 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:47:46 crc kubenswrapper[4830]: I1203 22:47:46.336644 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:47:46 crc kubenswrapper[4830]: E1203 22:47:46.337348 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:48:01 crc kubenswrapper[4830]: I1203 22:48:01.346409 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:48:01 crc kubenswrapper[4830]: E1203 22:48:01.347306 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:48:14 crc kubenswrapper[4830]: I1203 22:48:14.337476 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:48:14 crc kubenswrapper[4830]: E1203 22:48:14.338387 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:48:26 crc kubenswrapper[4830]: I1203 22:48:26.336888 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:48:26 crc kubenswrapper[4830]: E1203 22:48:26.337885 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:48:39 crc kubenswrapper[4830]: I1203 22:48:39.337831 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:48:39 crc kubenswrapper[4830]: E1203 22:48:39.338646 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:48:52 crc kubenswrapper[4830]: I1203 22:48:52.337669 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:48:52 crc kubenswrapper[4830]: E1203 22:48:52.338413 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:48:55 crc kubenswrapper[4830]: I1203 22:48:55.244719 4830 generic.go:334] "Generic (PLEG): container finished" podID="dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51" containerID="8b6ac0c8093e2b8b6d57491b53e08a44f5fcf0ada3bb4aea37548b51c73e55ea" exitCode=0 Dec 03 22:48:55 crc kubenswrapper[4830]: I1203 22:48:55.245184 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" event={"ID":"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51","Type":"ContainerDied","Data":"8b6ac0c8093e2b8b6d57491b53e08a44f5fcf0ada3bb4aea37548b51c73e55ea"} Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.767683 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.814101 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbh8r\" (UniqueName: \"kubernetes.io/projected/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-kube-api-access-xbh8r\") pod \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.814240 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-inventory\") pod \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.814349 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-secret-0\") pod \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.814439 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-combined-ca-bundle\") pod \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.814528 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-ssh-key\") pod \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\" (UID: \"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51\") " Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.821662 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51" (UID: "dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.822345 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-kube-api-access-xbh8r" (OuterVolumeSpecName: "kube-api-access-xbh8r") pod "dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51" (UID: "dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51"). InnerVolumeSpecName "kube-api-access-xbh8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.845372 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-inventory" (OuterVolumeSpecName: "inventory") pod "dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51" (UID: "dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.853023 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51" (UID: "dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.854759 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51" (UID: "dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.917225 4830 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.917259 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.917269 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbh8r\" (UniqueName: \"kubernetes.io/projected/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-kube-api-access-xbh8r\") on node \"crc\" DevicePath \"\"" Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.917278 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:48:56 crc kubenswrapper[4830]: I1203 22:48:56.917289 4830 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.277626 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" event={"ID":"dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51","Type":"ContainerDied","Data":"0bcc00ff22a7ca1e8640cf16a83416f0cc08d25c79258e230d35a5c115d0086c"} Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.277663 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.277684 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bcc00ff22a7ca1e8640cf16a83416f0cc08d25c79258e230d35a5c115d0086c" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.369266 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b"] Dec 03 22:48:57 crc kubenswrapper[4830]: E1203 22:48:57.370144 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" containerName="extract-content" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.370166 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" containerName="extract-content" Dec 03 22:48:57 crc kubenswrapper[4830]: E1203 22:48:57.370193 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" containerName="extract-utilities" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.370202 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" containerName="extract-utilities" Dec 03 22:48:57 crc kubenswrapper[4830]: E1203 22:48:57.370240 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.370248 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 22:48:57 crc kubenswrapper[4830]: E1203 22:48:57.370266 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" containerName="registry-server" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 
22:48:57.370274 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" containerName="registry-server" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.370462 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.370482 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfe18b8-82cf-4a9f-afd6-41e86f0b15cf" containerName="registry-server" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.371153 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.373145 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.373641 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.373963 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.374229 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.374726 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.374948 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.375191 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"nova-cell1-compute-config" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.393581 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b"] Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.432078 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.432328 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.432419 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.432455 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.432601 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqhgh\" (UniqueName: \"kubernetes.io/projected/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-kube-api-access-kqhgh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.432841 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.432898 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.433390 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.433469 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.534928 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqhgh\" (UniqueName: \"kubernetes.io/projected/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-kube-api-access-kqhgh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.535020 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.535044 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.535068 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: 
\"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.535096 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.535135 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.535209 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.535230 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.535246 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" 
(UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.536005 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.539369 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.539773 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.539840 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 
03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.540933 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.541078 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.541603 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.543994 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.551053 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqhgh\" (UniqueName: \"kubernetes.io/projected/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-kube-api-access-kqhgh\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cjj5b\" 
(UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:57 crc kubenswrapper[4830]: I1203 22:48:57.689060 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:48:58 crc kubenswrapper[4830]: I1203 22:48:58.217750 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b"] Dec 03 22:48:58 crc kubenswrapper[4830]: I1203 22:48:58.223304 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:48:58 crc kubenswrapper[4830]: I1203 22:48:58.288199 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" event={"ID":"fde89bd5-aa9c-44c2-b854-696c3e0f50e7","Type":"ContainerStarted","Data":"155ca1d1c731abaeda6735b55859321354df4be75bf5a77e390b048df3a8b062"} Dec 03 22:48:59 crc kubenswrapper[4830]: I1203 22:48:59.299434 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" event={"ID":"fde89bd5-aa9c-44c2-b854-696c3e0f50e7","Type":"ContainerStarted","Data":"bc6303a48751b202f3236eaab67050ffea85ce2428e1a3334037518833b8c3e1"} Dec 03 22:48:59 crc kubenswrapper[4830]: I1203 22:48:59.333275 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" podStartSLOduration=1.923543937 podStartE2EDuration="2.333257211s" podCreationTimestamp="2025-12-03 22:48:57 +0000 UTC" firstStartedPulling="2025-12-03 22:48:58.22304164 +0000 UTC m=+2627.219502989" lastFinishedPulling="2025-12-03 22:48:58.632754904 +0000 UTC m=+2627.629216263" observedRunningTime="2025-12-03 22:48:59.328184695 +0000 UTC m=+2628.324646044" watchObservedRunningTime="2025-12-03 22:48:59.333257211 +0000 UTC m=+2628.329718560" Dec 03 22:49:03 crc 
kubenswrapper[4830]: I1203 22:49:03.336986 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:49:03 crc kubenswrapper[4830]: E1203 22:49:03.337889 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:49:17 crc kubenswrapper[4830]: I1203 22:49:17.337577 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:49:17 crc kubenswrapper[4830]: E1203 22:49:17.338303 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:49:28 crc kubenswrapper[4830]: I1203 22:49:28.337220 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:49:28 crc kubenswrapper[4830]: E1203 22:49:28.338179 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 
03 22:49:39 crc kubenswrapper[4830]: I1203 22:49:39.337055 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:49:39 crc kubenswrapper[4830]: E1203 22:49:39.339807 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:49:50 crc kubenswrapper[4830]: I1203 22:49:50.337859 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:49:50 crc kubenswrapper[4830]: E1203 22:49:50.340003 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:50:03 crc kubenswrapper[4830]: I1203 22:50:03.337339 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:50:04 crc kubenswrapper[4830]: I1203 22:50:04.090271 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"c8a87b23e4c7076abba20f763ecb4e8f46e7310b3b10bc11d86d25fed1ce60dc"} Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.296833 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-2z8kw"] Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.299540 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.316431 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2z8kw"] Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.439158 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdmbx\" (UniqueName: \"kubernetes.io/projected/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-kube-api-access-vdmbx\") pod \"community-operators-2z8kw\" (UID: \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.439272 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-utilities\") pod \"community-operators-2z8kw\" (UID: \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.439369 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-catalog-content\") pod \"community-operators-2z8kw\" (UID: \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.541368 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdmbx\" (UniqueName: \"kubernetes.io/projected/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-kube-api-access-vdmbx\") pod \"community-operators-2z8kw\" (UID: 
\"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.541501 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-utilities\") pod \"community-operators-2z8kw\" (UID: \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.541627 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-catalog-content\") pod \"community-operators-2z8kw\" (UID: \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.542107 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-utilities\") pod \"community-operators-2z8kw\" (UID: \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.543279 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-catalog-content\") pod \"community-operators-2z8kw\" (UID: \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.564232 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdmbx\" (UniqueName: \"kubernetes.io/projected/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-kube-api-access-vdmbx\") pod \"community-operators-2z8kw\" (UID: \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " 
pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:27 crc kubenswrapper[4830]: I1203 22:50:27.662840 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:28 crc kubenswrapper[4830]: I1203 22:50:28.178035 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2z8kw"] Dec 03 22:50:28 crc kubenswrapper[4830]: I1203 22:50:28.649987 4830 generic.go:334] "Generic (PLEG): container finished" podID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" containerID="3d71c7116dbeb685368d4d769e1c106376e9d8f6978a917bd2b81ae7508cdbf0" exitCode=0 Dec 03 22:50:28 crc kubenswrapper[4830]: I1203 22:50:28.650081 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8kw" event={"ID":"1b1eaaa6-80fc-4eb4-882c-cffb094b916d","Type":"ContainerDied","Data":"3d71c7116dbeb685368d4d769e1c106376e9d8f6978a917bd2b81ae7508cdbf0"} Dec 03 22:50:28 crc kubenswrapper[4830]: I1203 22:50:28.650323 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8kw" event={"ID":"1b1eaaa6-80fc-4eb4-882c-cffb094b916d","Type":"ContainerStarted","Data":"1dc1624f299317b724dfe707ca94397a1c752f032774dfe4170f099429849ef7"} Dec 03 22:50:31 crc kubenswrapper[4830]: I1203 22:50:31.710600 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8kw" event={"ID":"1b1eaaa6-80fc-4eb4-882c-cffb094b916d","Type":"ContainerStarted","Data":"30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715"} Dec 03 22:50:32 crc kubenswrapper[4830]: I1203 22:50:32.722617 4830 generic.go:334] "Generic (PLEG): container finished" podID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" containerID="30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715" exitCode=0 Dec 03 22:50:32 crc kubenswrapper[4830]: I1203 22:50:32.722668 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8kw" event={"ID":"1b1eaaa6-80fc-4eb4-882c-cffb094b916d","Type":"ContainerDied","Data":"30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715"} Dec 03 22:50:34 crc kubenswrapper[4830]: I1203 22:50:34.743108 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8kw" event={"ID":"1b1eaaa6-80fc-4eb4-882c-cffb094b916d","Type":"ContainerStarted","Data":"2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba"} Dec 03 22:50:34 crc kubenswrapper[4830]: I1203 22:50:34.770087 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2z8kw" podStartSLOduration=2.508060013 podStartE2EDuration="7.770069861s" podCreationTimestamp="2025-12-03 22:50:27 +0000 UTC" firstStartedPulling="2025-12-03 22:50:28.653313679 +0000 UTC m=+2717.649775028" lastFinishedPulling="2025-12-03 22:50:33.915323527 +0000 UTC m=+2722.911784876" observedRunningTime="2025-12-03 22:50:34.762021573 +0000 UTC m=+2723.758482932" watchObservedRunningTime="2025-12-03 22:50:34.770069861 +0000 UTC m=+2723.766531210" Dec 03 22:50:37 crc kubenswrapper[4830]: I1203 22:50:37.680703 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:37 crc kubenswrapper[4830]: I1203 22:50:37.681127 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:37 crc kubenswrapper[4830]: I1203 22:50:37.740952 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:47 crc kubenswrapper[4830]: I1203 22:50:47.716222 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:47 crc kubenswrapper[4830]: I1203 
22:50:47.770652 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2z8kw"] Dec 03 22:50:47 crc kubenswrapper[4830]: I1203 22:50:47.874683 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2z8kw" podUID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" containerName="registry-server" containerID="cri-o://2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba" gracePeriod=2 Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.394545 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.546255 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-utilities\") pod \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\" (UID: \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.546574 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdmbx\" (UniqueName: \"kubernetes.io/projected/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-kube-api-access-vdmbx\") pod \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\" (UID: \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.546598 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-catalog-content\") pod \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\" (UID: \"1b1eaaa6-80fc-4eb4-882c-cffb094b916d\") " Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.550089 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-utilities" (OuterVolumeSpecName: 
"utilities") pod "1b1eaaa6-80fc-4eb4-882c-cffb094b916d" (UID: "1b1eaaa6-80fc-4eb4-882c-cffb094b916d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.554490 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-kube-api-access-vdmbx" (OuterVolumeSpecName: "kube-api-access-vdmbx") pod "1b1eaaa6-80fc-4eb4-882c-cffb094b916d" (UID: "1b1eaaa6-80fc-4eb4-882c-cffb094b916d"). InnerVolumeSpecName "kube-api-access-vdmbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.602931 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b1eaaa6-80fc-4eb4-882c-cffb094b916d" (UID: "1b1eaaa6-80fc-4eb4-882c-cffb094b916d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.648725 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdmbx\" (UniqueName: \"kubernetes.io/projected/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-kube-api-access-vdmbx\") on node \"crc\" DevicePath \"\"" Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.648780 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.648795 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1eaaa6-80fc-4eb4-882c-cffb094b916d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.887651 4830 generic.go:334] "Generic (PLEG): container finished" podID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" containerID="2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba" exitCode=0 Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.887740 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2z8kw" Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.887755 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8kw" event={"ID":"1b1eaaa6-80fc-4eb4-882c-cffb094b916d","Type":"ContainerDied","Data":"2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba"} Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.889104 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8kw" event={"ID":"1b1eaaa6-80fc-4eb4-882c-cffb094b916d","Type":"ContainerDied","Data":"1dc1624f299317b724dfe707ca94397a1c752f032774dfe4170f099429849ef7"} Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.889130 4830 scope.go:117] "RemoveContainer" containerID="2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba" Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.916191 4830 scope.go:117] "RemoveContainer" containerID="30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715" Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.927582 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2z8kw"] Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.939688 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2z8kw"] Dec 03 22:50:48 crc kubenswrapper[4830]: I1203 22:50:48.940803 4830 scope.go:117] "RemoveContainer" containerID="3d71c7116dbeb685368d4d769e1c106376e9d8f6978a917bd2b81ae7508cdbf0" Dec 03 22:50:49 crc kubenswrapper[4830]: I1203 22:50:49.000116 4830 scope.go:117] "RemoveContainer" containerID="2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba" Dec 03 22:50:49 crc kubenswrapper[4830]: E1203 22:50:49.000621 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba\": container with ID starting with 2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba not found: ID does not exist" containerID="2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba" Dec 03 22:50:49 crc kubenswrapper[4830]: I1203 22:50:49.000658 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba"} err="failed to get container status \"2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba\": rpc error: code = NotFound desc = could not find container \"2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba\": container with ID starting with 2b8ae98fc281b41d91d84dee4b36ebbaa71bf3d0179173689ef41970ad9a8eba not found: ID does not exist" Dec 03 22:50:49 crc kubenswrapper[4830]: I1203 22:50:49.000686 4830 scope.go:117] "RemoveContainer" containerID="30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715" Dec 03 22:50:49 crc kubenswrapper[4830]: E1203 22:50:49.001116 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715\": container with ID starting with 30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715 not found: ID does not exist" containerID="30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715" Dec 03 22:50:49 crc kubenswrapper[4830]: I1203 22:50:49.001173 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715"} err="failed to get container status \"30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715\": rpc error: code = NotFound desc = could not find container \"30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715\": container with ID 
starting with 30c935a61388f8cb2c3651ebb9abcfa99ce3a23cbd64ceaf778c84d75f469715 not found: ID does not exist" Dec 03 22:50:49 crc kubenswrapper[4830]: I1203 22:50:49.001214 4830 scope.go:117] "RemoveContainer" containerID="3d71c7116dbeb685368d4d769e1c106376e9d8f6978a917bd2b81ae7508cdbf0" Dec 03 22:50:49 crc kubenswrapper[4830]: E1203 22:50:49.001503 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d71c7116dbeb685368d4d769e1c106376e9d8f6978a917bd2b81ae7508cdbf0\": container with ID starting with 3d71c7116dbeb685368d4d769e1c106376e9d8f6978a917bd2b81ae7508cdbf0 not found: ID does not exist" containerID="3d71c7116dbeb685368d4d769e1c106376e9d8f6978a917bd2b81ae7508cdbf0" Dec 03 22:50:49 crc kubenswrapper[4830]: I1203 22:50:49.001632 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d71c7116dbeb685368d4d769e1c106376e9d8f6978a917bd2b81ae7508cdbf0"} err="failed to get container status \"3d71c7116dbeb685368d4d769e1c106376e9d8f6978a917bd2b81ae7508cdbf0\": rpc error: code = NotFound desc = could not find container \"3d71c7116dbeb685368d4d769e1c106376e9d8f6978a917bd2b81ae7508cdbf0\": container with ID starting with 3d71c7116dbeb685368d4d769e1c106376e9d8f6978a917bd2b81ae7508cdbf0 not found: ID does not exist" Dec 03 22:50:49 crc kubenswrapper[4830]: I1203 22:50:49.348170 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" path="/var/lib/kubelet/pods/1b1eaaa6-80fc-4eb4-882c-cffb094b916d/volumes" Dec 03 22:51:43 crc kubenswrapper[4830]: I1203 22:51:43.446031 4830 generic.go:334] "Generic (PLEG): container finished" podID="fde89bd5-aa9c-44c2-b854-696c3e0f50e7" containerID="bc6303a48751b202f3236eaab67050ffea85ce2428e1a3334037518833b8c3e1" exitCode=0 Dec 03 22:51:43 crc kubenswrapper[4830]: I1203 22:51:43.446155 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" event={"ID":"fde89bd5-aa9c-44c2-b854-696c3e0f50e7","Type":"ContainerDied","Data":"bc6303a48751b202f3236eaab67050ffea85ce2428e1a3334037518833b8c3e1"} Dec 03 22:51:44 crc kubenswrapper[4830]: I1203 22:51:44.946215 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.002598 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-1\") pod \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.002685 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-1\") pod \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.002717 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-combined-ca-bundle\") pod \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.002825 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-inventory\") pod \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.002865 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kqhgh\" (UniqueName: \"kubernetes.io/projected/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-kube-api-access-kqhgh\") pod \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.003001 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-0\") pod \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.003029 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-0\") pod \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.003102 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-extra-config-0\") pod \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.003133 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-ssh-key\") pod \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\" (UID: \"fde89bd5-aa9c-44c2-b854-696c3e0f50e7\") " Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.028968 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-kube-api-access-kqhgh" (OuterVolumeSpecName: "kube-api-access-kqhgh") pod 
"fde89bd5-aa9c-44c2-b854-696c3e0f50e7" (UID: "fde89bd5-aa9c-44c2-b854-696c3e0f50e7"). InnerVolumeSpecName "kube-api-access-kqhgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.032609 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "fde89bd5-aa9c-44c2-b854-696c3e0f50e7" (UID: "fde89bd5-aa9c-44c2-b854-696c3e0f50e7"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.048668 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "fde89bd5-aa9c-44c2-b854-696c3e0f50e7" (UID: "fde89bd5-aa9c-44c2-b854-696c3e0f50e7"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.050638 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-inventory" (OuterVolumeSpecName: "inventory") pod "fde89bd5-aa9c-44c2-b854-696c3e0f50e7" (UID: "fde89bd5-aa9c-44c2-b854-696c3e0f50e7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.051941 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "fde89bd5-aa9c-44c2-b854-696c3e0f50e7" (UID: "fde89bd5-aa9c-44c2-b854-696c3e0f50e7"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.060760 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "fde89bd5-aa9c-44c2-b854-696c3e0f50e7" (UID: "fde89bd5-aa9c-44c2-b854-696c3e0f50e7"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.078098 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fde89bd5-aa9c-44c2-b854-696c3e0f50e7" (UID: "fde89bd5-aa9c-44c2-b854-696c3e0f50e7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.083976 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "fde89bd5-aa9c-44c2-b854-696c3e0f50e7" (UID: "fde89bd5-aa9c-44c2-b854-696c3e0f50e7"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.092248 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "fde89bd5-aa9c-44c2-b854-696c3e0f50e7" (UID: "fde89bd5-aa9c-44c2-b854-696c3e0f50e7"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.106000 4830 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.106032 4830 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.106041 4830 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.106050 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.106060 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqhgh\" (UniqueName: \"kubernetes.io/projected/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-kube-api-access-kqhgh\") on node \"crc\" DevicePath \"\"" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.106068 4830 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.106078 4830 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-migration-ssh-key-0\") on node 
\"crc\" DevicePath \"\"" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.106087 4830 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.106094 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fde89bd5-aa9c-44c2-b854-696c3e0f50e7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.468944 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" event={"ID":"fde89bd5-aa9c-44c2-b854-696c3e0f50e7","Type":"ContainerDied","Data":"155ca1d1c731abaeda6735b55859321354df4be75bf5a77e390b048df3a8b062"} Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.469224 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="155ca1d1c731abaeda6735b55859321354df4be75bf5a77e390b048df3a8b062" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.469006 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cjj5b" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.581855 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl"] Dec 03 22:51:45 crc kubenswrapper[4830]: E1203 22:51:45.582246 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" containerName="extract-utilities" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.582264 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" containerName="extract-utilities" Dec 03 22:51:45 crc kubenswrapper[4830]: E1203 22:51:45.582282 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" containerName="registry-server" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.582288 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" containerName="registry-server" Dec 03 22:51:45 crc kubenswrapper[4830]: E1203 22:51:45.582301 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde89bd5-aa9c-44c2-b854-696c3e0f50e7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.582307 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde89bd5-aa9c-44c2-b854-696c3e0f50e7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 22:51:45 crc kubenswrapper[4830]: E1203 22:51:45.582330 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" containerName="extract-content" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.582336 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" containerName="extract-content" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.582587 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1b1eaaa6-80fc-4eb4-882c-cffb094b916d" containerName="registry-server" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.582614 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fde89bd5-aa9c-44c2-b854-696c3e0f50e7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.583367 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.585789 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.586398 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.586462 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.586827 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.589311 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxxv8" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.598230 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl"] Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.615426 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.615496 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d68g\" (UniqueName: \"kubernetes.io/projected/3cbad56b-f578-4ad4-bdb4-13c72261814d-kube-api-access-2d68g\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.615563 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.615602 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.615642 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.616288 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.616377 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.718209 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.718344 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.718370 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.718429 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.718461 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d68g\" (UniqueName: \"kubernetes.io/projected/3cbad56b-f578-4ad4-bdb4-13c72261814d-kube-api-access-2d68g\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.718501 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.718559 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.723340 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.723340 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.724878 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.725178 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.725693 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.726271 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.746334 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d68g\" (UniqueName: \"kubernetes.io/projected/3cbad56b-f578-4ad4-bdb4-13c72261814d-kube-api-access-2d68g\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:45 crc kubenswrapper[4830]: I1203 22:51:45.917648 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:51:46 crc kubenswrapper[4830]: I1203 22:51:46.458670 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl"] Dec 03 22:51:46 crc kubenswrapper[4830]: I1203 22:51:46.480915 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" event={"ID":"3cbad56b-f578-4ad4-bdb4-13c72261814d","Type":"ContainerStarted","Data":"50f4bfc96eb7c2e9eaa8abf2f967cc3b2e0ea957e1eda5f7b6a941466e2896d4"} Dec 03 22:51:47 crc kubenswrapper[4830]: I1203 22:51:47.491686 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" event={"ID":"3cbad56b-f578-4ad4-bdb4-13c72261814d","Type":"ContainerStarted","Data":"fb4061fb14a895895d74677aa3d499a10eae3bcf3a0cdea569c2996cd04f5461"} Dec 03 22:51:47 crc kubenswrapper[4830]: I1203 22:51:47.513608 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" podStartSLOduration=2.102537877 podStartE2EDuration="2.513585936s" podCreationTimestamp="2025-12-03 22:51:45 +0000 UTC" firstStartedPulling="2025-12-03 22:51:46.461805778 +0000 UTC m=+2795.458267127" lastFinishedPulling="2025-12-03 22:51:46.872853837 +0000 UTC m=+2795.869315186" observedRunningTime="2025-12-03 22:51:47.505273301 +0000 UTC m=+2796.501734650" watchObservedRunningTime="2025-12-03 22:51:47.513585936 +0000 UTC m=+2796.510047285" Dec 03 22:52:08 crc kubenswrapper[4830]: I1203 22:52:08.865183 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qw2j4"] Dec 03 22:52:08 crc kubenswrapper[4830]: I1203 22:52:08.868613 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:08 crc kubenswrapper[4830]: I1203 22:52:08.874935 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qw2j4"] Dec 03 22:52:08 crc kubenswrapper[4830]: I1203 22:52:08.954745 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgk7r\" (UniqueName: \"kubernetes.io/projected/403579ed-d48d-479b-80f0-85d1c0f3fec2-kube-api-access-bgk7r\") pod \"certified-operators-qw2j4\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:08 crc kubenswrapper[4830]: I1203 22:52:08.954822 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-utilities\") pod \"certified-operators-qw2j4\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:08 crc kubenswrapper[4830]: I1203 22:52:08.955300 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-catalog-content\") pod \"certified-operators-qw2j4\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:09 crc kubenswrapper[4830]: I1203 22:52:09.056944 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-catalog-content\") pod \"certified-operators-qw2j4\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:09 crc kubenswrapper[4830]: I1203 22:52:09.057246 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bgk7r\" (UniqueName: \"kubernetes.io/projected/403579ed-d48d-479b-80f0-85d1c0f3fec2-kube-api-access-bgk7r\") pod \"certified-operators-qw2j4\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:09 crc kubenswrapper[4830]: I1203 22:52:09.057349 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-utilities\") pod \"certified-operators-qw2j4\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:09 crc kubenswrapper[4830]: I1203 22:52:09.057385 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-catalog-content\") pod \"certified-operators-qw2j4\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:09 crc kubenswrapper[4830]: I1203 22:52:09.057619 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-utilities\") pod \"certified-operators-qw2j4\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:09 crc kubenswrapper[4830]: I1203 22:52:09.085596 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgk7r\" (UniqueName: \"kubernetes.io/projected/403579ed-d48d-479b-80f0-85d1c0f3fec2-kube-api-access-bgk7r\") pod \"certified-operators-qw2j4\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:09 crc kubenswrapper[4830]: I1203 22:52:09.196498 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:09 crc kubenswrapper[4830]: I1203 22:52:09.749344 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qw2j4"] Dec 03 22:52:10 crc kubenswrapper[4830]: I1203 22:52:10.718384 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw2j4" event={"ID":"403579ed-d48d-479b-80f0-85d1c0f3fec2","Type":"ContainerStarted","Data":"0a44cf45cc52880982d583768bc9a32ddd6277c0ff25c17904bcfa292a55297d"} Dec 03 22:52:11 crc kubenswrapper[4830]: I1203 22:52:11.729755 4830 generic.go:334] "Generic (PLEG): container finished" podID="403579ed-d48d-479b-80f0-85d1c0f3fec2" containerID="d036b03e84aa6665b9cdf8f30fb5d0606425fe7e5a9b082e10b3d6747fbac163" exitCode=0 Dec 03 22:52:11 crc kubenswrapper[4830]: I1203 22:52:11.729806 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw2j4" event={"ID":"403579ed-d48d-479b-80f0-85d1c0f3fec2","Type":"ContainerDied","Data":"d036b03e84aa6665b9cdf8f30fb5d0606425fe7e5a9b082e10b3d6747fbac163"} Dec 03 22:52:13 crc kubenswrapper[4830]: I1203 22:52:13.751404 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw2j4" event={"ID":"403579ed-d48d-479b-80f0-85d1c0f3fec2","Type":"ContainerStarted","Data":"afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1"} Dec 03 22:52:14 crc kubenswrapper[4830]: I1203 22:52:14.764758 4830 generic.go:334] "Generic (PLEG): container finished" podID="403579ed-d48d-479b-80f0-85d1c0f3fec2" containerID="afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1" exitCode=0 Dec 03 22:52:14 crc kubenswrapper[4830]: I1203 22:52:14.764821 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw2j4" 
event={"ID":"403579ed-d48d-479b-80f0-85d1c0f3fec2","Type":"ContainerDied","Data":"afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1"} Dec 03 22:52:17 crc kubenswrapper[4830]: I1203 22:52:17.792076 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw2j4" event={"ID":"403579ed-d48d-479b-80f0-85d1c0f3fec2","Type":"ContainerStarted","Data":"c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1"} Dec 03 22:52:17 crc kubenswrapper[4830]: I1203 22:52:17.813077 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qw2j4" podStartSLOduration=6.030608029 podStartE2EDuration="9.813059306s" podCreationTimestamp="2025-12-03 22:52:08 +0000 UTC" firstStartedPulling="2025-12-03 22:52:11.732073291 +0000 UTC m=+2820.728534640" lastFinishedPulling="2025-12-03 22:52:15.514524568 +0000 UTC m=+2824.510985917" observedRunningTime="2025-12-03 22:52:17.809019027 +0000 UTC m=+2826.805480396" watchObservedRunningTime="2025-12-03 22:52:17.813059306 +0000 UTC m=+2826.809520685" Dec 03 22:52:19 crc kubenswrapper[4830]: I1203 22:52:19.196795 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:19 crc kubenswrapper[4830]: I1203 22:52:19.197247 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:19 crc kubenswrapper[4830]: I1203 22:52:19.249353 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:26 crc kubenswrapper[4830]: I1203 22:52:26.680932 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 03 22:52:26 crc kubenswrapper[4830]: I1203 22:52:26.681477 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:52:29 crc kubenswrapper[4830]: I1203 22:52:29.249177 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:29 crc kubenswrapper[4830]: I1203 22:52:29.317019 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qw2j4"] Dec 03 22:52:29 crc kubenswrapper[4830]: I1203 22:52:29.928932 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qw2j4" podUID="403579ed-d48d-479b-80f0-85d1c0f3fec2" containerName="registry-server" containerID="cri-o://c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1" gracePeriod=2 Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.468721 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.611677 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-utilities\") pod \"403579ed-d48d-479b-80f0-85d1c0f3fec2\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.612633 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-utilities" (OuterVolumeSpecName: "utilities") pod "403579ed-d48d-479b-80f0-85d1c0f3fec2" (UID: "403579ed-d48d-479b-80f0-85d1c0f3fec2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.612697 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgk7r\" (UniqueName: \"kubernetes.io/projected/403579ed-d48d-479b-80f0-85d1c0f3fec2-kube-api-access-bgk7r\") pod \"403579ed-d48d-479b-80f0-85d1c0f3fec2\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.612937 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-catalog-content\") pod \"403579ed-d48d-479b-80f0-85d1c0f3fec2\" (UID: \"403579ed-d48d-479b-80f0-85d1c0f3fec2\") " Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.613761 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.617663 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/403579ed-d48d-479b-80f0-85d1c0f3fec2-kube-api-access-bgk7r" (OuterVolumeSpecName: "kube-api-access-bgk7r") pod "403579ed-d48d-479b-80f0-85d1c0f3fec2" (UID: "403579ed-d48d-479b-80f0-85d1c0f3fec2"). InnerVolumeSpecName "kube-api-access-bgk7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.654606 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "403579ed-d48d-479b-80f0-85d1c0f3fec2" (UID: "403579ed-d48d-479b-80f0-85d1c0f3fec2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.715165 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgk7r\" (UniqueName: \"kubernetes.io/projected/403579ed-d48d-479b-80f0-85d1c0f3fec2-kube-api-access-bgk7r\") on node \"crc\" DevicePath \"\"" Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.715195 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/403579ed-d48d-479b-80f0-85d1c0f3fec2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.945941 4830 generic.go:334] "Generic (PLEG): container finished" podID="403579ed-d48d-479b-80f0-85d1c0f3fec2" containerID="c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1" exitCode=0 Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.946246 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qw2j4" Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.946253 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw2j4" event={"ID":"403579ed-d48d-479b-80f0-85d1c0f3fec2","Type":"ContainerDied","Data":"c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1"} Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.946577 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw2j4" event={"ID":"403579ed-d48d-479b-80f0-85d1c0f3fec2","Type":"ContainerDied","Data":"0a44cf45cc52880982d583768bc9a32ddd6277c0ff25c17904bcfa292a55297d"} Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.946601 4830 scope.go:117] "RemoveContainer" containerID="c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1" Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.992319 4830 scope.go:117] "RemoveContainer" containerID="afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1" Dec 03 22:52:30 crc kubenswrapper[4830]: I1203 22:52:30.996395 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qw2j4"] Dec 03 22:52:31 crc kubenswrapper[4830]: I1203 22:52:31.007044 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qw2j4"] Dec 03 22:52:31 crc kubenswrapper[4830]: I1203 22:52:31.021200 4830 scope.go:117] "RemoveContainer" containerID="d036b03e84aa6665b9cdf8f30fb5d0606425fe7e5a9b082e10b3d6747fbac163" Dec 03 22:52:31 crc kubenswrapper[4830]: I1203 22:52:31.067780 4830 scope.go:117] "RemoveContainer" containerID="c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1" Dec 03 22:52:31 crc kubenswrapper[4830]: E1203 22:52:31.068249 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1\": container with ID starting with c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1 not found: ID does not exist" containerID="c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1" Dec 03 22:52:31 crc kubenswrapper[4830]: I1203 22:52:31.068297 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1"} err="failed to get container status \"c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1\": rpc error: code = NotFound desc = could not find container \"c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1\": container with ID starting with c6e21e9842049a6b94c1aef962a0909aa8d36a85efca71760a19a4c35ffd87b1 not found: ID does not exist" Dec 03 22:52:31 crc kubenswrapper[4830]: I1203 22:52:31.068324 4830 scope.go:117] "RemoveContainer" containerID="afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1" Dec 03 22:52:31 crc kubenswrapper[4830]: E1203 22:52:31.068611 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1\": container with ID starting with afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1 not found: ID does not exist" containerID="afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1" Dec 03 22:52:31 crc kubenswrapper[4830]: I1203 22:52:31.068644 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1"} err="failed to get container status \"afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1\": rpc error: code = NotFound desc = could not find container \"afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1\": container with ID 
starting with afca8b25172577a8aeed2ff19d202886eb911e734e4c65b11cd725ddc2e0b5a1 not found: ID does not exist" Dec 03 22:52:31 crc kubenswrapper[4830]: I1203 22:52:31.068663 4830 scope.go:117] "RemoveContainer" containerID="d036b03e84aa6665b9cdf8f30fb5d0606425fe7e5a9b082e10b3d6747fbac163" Dec 03 22:52:31 crc kubenswrapper[4830]: E1203 22:52:31.068989 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d036b03e84aa6665b9cdf8f30fb5d0606425fe7e5a9b082e10b3d6747fbac163\": container with ID starting with d036b03e84aa6665b9cdf8f30fb5d0606425fe7e5a9b082e10b3d6747fbac163 not found: ID does not exist" containerID="d036b03e84aa6665b9cdf8f30fb5d0606425fe7e5a9b082e10b3d6747fbac163" Dec 03 22:52:31 crc kubenswrapper[4830]: I1203 22:52:31.069048 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d036b03e84aa6665b9cdf8f30fb5d0606425fe7e5a9b082e10b3d6747fbac163"} err="failed to get container status \"d036b03e84aa6665b9cdf8f30fb5d0606425fe7e5a9b082e10b3d6747fbac163\": rpc error: code = NotFound desc = could not find container \"d036b03e84aa6665b9cdf8f30fb5d0606425fe7e5a9b082e10b3d6747fbac163\": container with ID starting with d036b03e84aa6665b9cdf8f30fb5d0606425fe7e5a9b082e10b3d6747fbac163 not found: ID does not exist" Dec 03 22:52:31 crc kubenswrapper[4830]: I1203 22:52:31.352028 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403579ed-d48d-479b-80f0-85d1c0f3fec2" path="/var/lib/kubelet/pods/403579ed-d48d-479b-80f0-85d1c0f3fec2/volumes" Dec 03 22:52:56 crc kubenswrapper[4830]: I1203 22:52:56.681406 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:52:56 crc kubenswrapper[4830]: I1203 
22:52:56.681995 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:53:26 crc kubenswrapper[4830]: I1203 22:53:26.681329 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:53:26 crc kubenswrapper[4830]: I1203 22:53:26.681819 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:53:26 crc kubenswrapper[4830]: I1203 22:53:26.681882 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:53:26 crc kubenswrapper[4830]: I1203 22:53:26.682895 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c8a87b23e4c7076abba20f763ecb4e8f46e7310b3b10bc11d86d25fed1ce60dc"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 22:53:26 crc kubenswrapper[4830]: I1203 22:53:26.682983 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" 
containerName="machine-config-daemon" containerID="cri-o://c8a87b23e4c7076abba20f763ecb4e8f46e7310b3b10bc11d86d25fed1ce60dc" gracePeriod=600 Dec 03 22:53:29 crc kubenswrapper[4830]: I1203 22:53:29.534917 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="c8a87b23e4c7076abba20f763ecb4e8f46e7310b3b10bc11d86d25fed1ce60dc" exitCode=0 Dec 03 22:53:29 crc kubenswrapper[4830]: I1203 22:53:29.534989 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"c8a87b23e4c7076abba20f763ecb4e8f46e7310b3b10bc11d86d25fed1ce60dc"} Dec 03 22:53:29 crc kubenswrapper[4830]: I1203 22:53:29.535059 4830 scope.go:117] "RemoveContainer" containerID="a5cb1e0dd4f587a441f3689fbc61a167c3fc0ea55c45bd545220b01eda3d1457" Dec 03 22:53:30 crc kubenswrapper[4830]: I1203 22:53:30.548278 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1"} Dec 03 22:54:04 crc kubenswrapper[4830]: I1203 22:54:04.903671 4830 generic.go:334] "Generic (PLEG): container finished" podID="3cbad56b-f578-4ad4-bdb4-13c72261814d" containerID="fb4061fb14a895895d74677aa3d499a10eae3bcf3a0cdea569c2996cd04f5461" exitCode=0 Dec 03 22:54:04 crc kubenswrapper[4830]: I1203 22:54:04.903736 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" event={"ID":"3cbad56b-f578-4ad4-bdb4-13c72261814d","Type":"ContainerDied","Data":"fb4061fb14a895895d74677aa3d499a10eae3bcf3a0cdea569c2996cd04f5461"} Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.383118 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.526668 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-0\") pod \"3cbad56b-f578-4ad4-bdb4-13c72261814d\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.527075 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d68g\" (UniqueName: \"kubernetes.io/projected/3cbad56b-f578-4ad4-bdb4-13c72261814d-kube-api-access-2d68g\") pod \"3cbad56b-f578-4ad4-bdb4-13c72261814d\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.527134 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-telemetry-combined-ca-bundle\") pod \"3cbad56b-f578-4ad4-bdb4-13c72261814d\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.527162 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-1\") pod \"3cbad56b-f578-4ad4-bdb4-13c72261814d\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.527202 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-inventory\") pod \"3cbad56b-f578-4ad4-bdb4-13c72261814d\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " Dec 03 22:54:06 crc kubenswrapper[4830]: 
I1203 22:54:06.527239 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-2\") pod \"3cbad56b-f578-4ad4-bdb4-13c72261814d\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.527280 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ssh-key\") pod \"3cbad56b-f578-4ad4-bdb4-13c72261814d\" (UID: \"3cbad56b-f578-4ad4-bdb4-13c72261814d\") " Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.532759 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3cbad56b-f578-4ad4-bdb4-13c72261814d" (UID: "3cbad56b-f578-4ad4-bdb4-13c72261814d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.534848 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cbad56b-f578-4ad4-bdb4-13c72261814d-kube-api-access-2d68g" (OuterVolumeSpecName: "kube-api-access-2d68g") pod "3cbad56b-f578-4ad4-bdb4-13c72261814d" (UID: "3cbad56b-f578-4ad4-bdb4-13c72261814d"). InnerVolumeSpecName "kube-api-access-2d68g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.559470 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "3cbad56b-f578-4ad4-bdb4-13c72261814d" (UID: "3cbad56b-f578-4ad4-bdb4-13c72261814d"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.561755 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-inventory" (OuterVolumeSpecName: "inventory") pod "3cbad56b-f578-4ad4-bdb4-13c72261814d" (UID: "3cbad56b-f578-4ad4-bdb4-13c72261814d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.562683 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "3cbad56b-f578-4ad4-bdb4-13c72261814d" (UID: "3cbad56b-f578-4ad4-bdb4-13c72261814d"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.573110 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "3cbad56b-f578-4ad4-bdb4-13c72261814d" (UID: "3cbad56b-f578-4ad4-bdb4-13c72261814d"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.573220 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3cbad56b-f578-4ad4-bdb4-13c72261814d" (UID: "3cbad56b-f578-4ad4-bdb4-13c72261814d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.629571 4830 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.629607 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d68g\" (UniqueName: \"kubernetes.io/projected/3cbad56b-f578-4ad4-bdb4-13c72261814d-kube-api-access-2d68g\") on node \"crc\" DevicePath \"\"" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.629617 4830 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.629644 4830 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.629656 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.629667 4830 reconciler_common.go:293] 
"Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.629677 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cbad56b-f578-4ad4-bdb4-13c72261814d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.933272 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" event={"ID":"3cbad56b-f578-4ad4-bdb4-13c72261814d","Type":"ContainerDied","Data":"50f4bfc96eb7c2e9eaa8abf2f967cc3b2e0ea957e1eda5f7b6a941466e2896d4"} Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.933318 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50f4bfc96eb7c2e9eaa8abf2f967cc3b2e0ea957e1eda5f7b6a941466e2896d4" Dec 03 22:54:06 crc kubenswrapper[4830]: I1203 22:54:06.933324 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl" Dec 03 22:54:55 crc kubenswrapper[4830]: E1203 22:54:55.720103 4830 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.217:34934->38.102.83.217:42691: write tcp 38.102.83.217:34934->38.102.83.217:42691: write: broken pipe Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.534321 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9c65t"] Dec 03 22:55:21 crc kubenswrapper[4830]: E1203 22:55:21.535724 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403579ed-d48d-479b-80f0-85d1c0f3fec2" containerName="extract-utilities" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.535738 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="403579ed-d48d-479b-80f0-85d1c0f3fec2" containerName="extract-utilities" Dec 03 22:55:21 crc kubenswrapper[4830]: E1203 22:55:21.535768 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cbad56b-f578-4ad4-bdb4-13c72261814d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.535775 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cbad56b-f578-4ad4-bdb4-13c72261814d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 22:55:21 crc kubenswrapper[4830]: E1203 22:55:21.535791 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403579ed-d48d-479b-80f0-85d1c0f3fec2" containerName="registry-server" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.535796 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="403579ed-d48d-479b-80f0-85d1c0f3fec2" containerName="registry-server" Dec 03 22:55:21 crc kubenswrapper[4830]: E1203 22:55:21.535811 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403579ed-d48d-479b-80f0-85d1c0f3fec2" containerName="extract-content" Dec 03 22:55:21 crc 
kubenswrapper[4830]: I1203 22:55:21.535818 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="403579ed-d48d-479b-80f0-85d1c0f3fec2" containerName="extract-content" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.536045 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cbad56b-f578-4ad4-bdb4-13c72261814d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.536068 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="403579ed-d48d-479b-80f0-85d1c0f3fec2" containerName="registry-server" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.537607 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.547782 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9c65t"] Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.682947 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdvc8\" (UniqueName: \"kubernetes.io/projected/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-kube-api-access-fdvc8\") pod \"redhat-operators-9c65t\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.682998 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-catalog-content\") pod \"redhat-operators-9c65t\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.683123 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-utilities\") pod \"redhat-operators-9c65t\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.785924 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-utilities\") pod \"redhat-operators-9c65t\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.785407 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-utilities\") pod \"redhat-operators-9c65t\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.787212 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdvc8\" (UniqueName: \"kubernetes.io/projected/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-kube-api-access-fdvc8\") pod \"redhat-operators-9c65t\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.787259 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-catalog-content\") pod \"redhat-operators-9c65t\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.787689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-catalog-content\") pod \"redhat-operators-9c65t\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.820445 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdvc8\" (UniqueName: \"kubernetes.io/projected/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-kube-api-access-fdvc8\") pod \"redhat-operators-9c65t\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:21 crc kubenswrapper[4830]: I1203 22:55:21.861131 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:22 crc kubenswrapper[4830]: I1203 22:55:22.485664 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9c65t"] Dec 03 22:55:23 crc kubenswrapper[4830]: I1203 22:55:23.044735 4830 generic.go:334] "Generic (PLEG): container finished" podID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerID="16d13a5cd8b9e470c38379fa33fc4ce5b5c286e67abd4d076816a3fc98b2cc55" exitCode=0 Dec 03 22:55:23 crc kubenswrapper[4830]: I1203 22:55:23.044865 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c65t" event={"ID":"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e","Type":"ContainerDied","Data":"16d13a5cd8b9e470c38379fa33fc4ce5b5c286e67abd4d076816a3fc98b2cc55"} Dec 03 22:55:23 crc kubenswrapper[4830]: I1203 22:55:23.045080 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c65t" event={"ID":"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e","Type":"ContainerStarted","Data":"76dbaef0248dcab27e935a8417eddd5c800defbba2ea7d66e75fc89682184404"} Dec 03 22:55:23 crc kubenswrapper[4830]: I1203 22:55:23.047065 4830 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 03 22:55:24 crc kubenswrapper[4830]: I1203 22:55:24.055234 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c65t" event={"ID":"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e","Type":"ContainerStarted","Data":"bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e"} Dec 03 22:55:27 crc kubenswrapper[4830]: I1203 22:55:27.082237 4830 generic.go:334] "Generic (PLEG): container finished" podID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerID="bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e" exitCode=0 Dec 03 22:55:27 crc kubenswrapper[4830]: I1203 22:55:27.082318 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c65t" event={"ID":"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e","Type":"ContainerDied","Data":"bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e"} Dec 03 22:55:29 crc kubenswrapper[4830]: I1203 22:55:29.104896 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c65t" event={"ID":"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e","Type":"ContainerStarted","Data":"29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b"} Dec 03 22:55:30 crc kubenswrapper[4830]: I1203 22:55:30.140820 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9c65t" podStartSLOduration=3.565416958 podStartE2EDuration="9.140795417s" podCreationTimestamp="2025-12-03 22:55:21 +0000 UTC" firstStartedPulling="2025-12-03 22:55:23.046706272 +0000 UTC m=+3012.043167611" lastFinishedPulling="2025-12-03 22:55:28.622084721 +0000 UTC m=+3017.618546070" observedRunningTime="2025-12-03 22:55:30.13462858 +0000 UTC m=+3019.131089929" watchObservedRunningTime="2025-12-03 22:55:30.140795417 +0000 UTC m=+3019.137256766" Dec 03 22:55:31 crc kubenswrapper[4830]: I1203 22:55:31.865665 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:31 crc kubenswrapper[4830]: I1203 22:55:31.865723 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:32 crc kubenswrapper[4830]: I1203 22:55:32.920197 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9c65t" podUID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerName="registry-server" probeResult="failure" output=< Dec 03 22:55:32 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Dec 03 22:55:32 crc kubenswrapper[4830]: > Dec 03 22:55:41 crc kubenswrapper[4830]: I1203 22:55:41.918187 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:41 crc kubenswrapper[4830]: I1203 22:55:41.975649 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:42 crc kubenswrapper[4830]: I1203 22:55:42.159758 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9c65t"] Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.232896 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9c65t" podUID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerName="registry-server" containerID="cri-o://29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b" gracePeriod=2 Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.776031 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.852054 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-utilities\") pod \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.852207 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdvc8\" (UniqueName: \"kubernetes.io/projected/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-kube-api-access-fdvc8\") pod \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.852236 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-catalog-content\") pod \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\" (UID: \"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e\") " Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.853361 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-utilities" (OuterVolumeSpecName: "utilities") pod "2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" (UID: "2ef31bb5-6d0b-4739-8129-c6c7c847ee3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.859164 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-kube-api-access-fdvc8" (OuterVolumeSpecName: "kube-api-access-fdvc8") pod "2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" (UID: "2ef31bb5-6d0b-4739-8129-c6c7c847ee3e"). InnerVolumeSpecName "kube-api-access-fdvc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.867470 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdvc8\" (UniqueName: \"kubernetes.io/projected/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-kube-api-access-fdvc8\") on node \"crc\" DevicePath \"\"" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.867637 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.895727 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 22:55:43 crc kubenswrapper[4830]: E1203 22:55:43.896569 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerName="extract-content" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.896675 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerName="extract-content" Dec 03 22:55:43 crc kubenswrapper[4830]: E1203 22:55:43.896761 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerName="registry-server" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.896843 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerName="registry-server" Dec 03 22:55:43 crc kubenswrapper[4830]: E1203 22:55:43.896970 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerName="extract-utilities" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.897047 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerName="extract-utilities" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 
22:55:43.897357 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerName="registry-server" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.898610 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.900944 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.901351 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.901543 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.901349 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w6mk7" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.906011 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.969639 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkckx\" (UniqueName: \"kubernetes.io/projected/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-kube-api-access-mkckx\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.969702 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:43 crc 
kubenswrapper[4830]: I1203 22:55:43.969726 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.969749 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-config-data\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.969790 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.969829 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.969882 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:43 
crc kubenswrapper[4830]: I1203 22:55:43.969897 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.969960 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:43 crc kubenswrapper[4830]: I1203 22:55:43.982719 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" (UID: "2ef31bb5-6d0b-4739-8129-c6c7c847ee3e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.071503 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.071589 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkckx\" (UniqueName: \"kubernetes.io/projected/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-kube-api-access-mkckx\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.071616 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.071639 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.072014 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-config-data\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: 
I1203 22:55:44.072071 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.072125 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.072248 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.072329 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.072669 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.072808 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.072826 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.072826 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.073187 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.073357 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-config-data\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.075556 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " 
pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.075953 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.076735 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.095334 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkckx\" (UniqueName: \"kubernetes.io/projected/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-kube-api-access-mkckx\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.105640 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.245749 4830 generic.go:334] "Generic (PLEG): container finished" podID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" containerID="29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b" exitCode=0 Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.245835 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c65t" 
event={"ID":"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e","Type":"ContainerDied","Data":"29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b"} Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.246884 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c65t" event={"ID":"2ef31bb5-6d0b-4739-8129-c6c7c847ee3e","Type":"ContainerDied","Data":"76dbaef0248dcab27e935a8417eddd5c800defbba2ea7d66e75fc89682184404"} Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.245869 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9c65t" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.246927 4830 scope.go:117] "RemoveContainer" containerID="29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.254309 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.288752 4830 scope.go:117] "RemoveContainer" containerID="bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.298359 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9c65t"] Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.311761 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9c65t"] Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.313943 4830 scope.go:117] "RemoveContainer" containerID="16d13a5cd8b9e470c38379fa33fc4ce5b5c286e67abd4d076816a3fc98b2cc55" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.349683 4830 scope.go:117] "RemoveContainer" containerID="29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b" Dec 03 22:55:44 crc kubenswrapper[4830]: E1203 22:55:44.350172 4830 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b\": container with ID starting with 29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b not found: ID does not exist" containerID="29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.350236 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b"} err="failed to get container status \"29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b\": rpc error: code = NotFound desc = could not find container \"29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b\": container with ID starting with 29045b2e1e92fb56429ae179ed1bd2e69cf6866fade991333efdf24030ec345b not found: ID does not exist" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.350270 4830 scope.go:117] "RemoveContainer" containerID="bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e" Dec 03 22:55:44 crc kubenswrapper[4830]: E1203 22:55:44.350762 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e\": container with ID starting with bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e not found: ID does not exist" containerID="bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.350798 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e"} err="failed to get container status \"bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e\": rpc error: code = NotFound desc = could not find container 
\"bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e\": container with ID starting with bf1d18213b67df6035babccb2a9a806709a78dae85f580b5f7e8ceb4eb39551e not found: ID does not exist" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.350824 4830 scope.go:117] "RemoveContainer" containerID="16d13a5cd8b9e470c38379fa33fc4ce5b5c286e67abd4d076816a3fc98b2cc55" Dec 03 22:55:44 crc kubenswrapper[4830]: E1203 22:55:44.355651 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d13a5cd8b9e470c38379fa33fc4ce5b5c286e67abd4d076816a3fc98b2cc55\": container with ID starting with 16d13a5cd8b9e470c38379fa33fc4ce5b5c286e67abd4d076816a3fc98b2cc55 not found: ID does not exist" containerID="16d13a5cd8b9e470c38379fa33fc4ce5b5c286e67abd4d076816a3fc98b2cc55" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.355706 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d13a5cd8b9e470c38379fa33fc4ce5b5c286e67abd4d076816a3fc98b2cc55"} err="failed to get container status \"16d13a5cd8b9e470c38379fa33fc4ce5b5c286e67abd4d076816a3fc98b2cc55\": rpc error: code = NotFound desc = could not find container \"16d13a5cd8b9e470c38379fa33fc4ce5b5c286e67abd4d076816a3fc98b2cc55\": container with ID starting with 16d13a5cd8b9e470c38379fa33fc4ce5b5c286e67abd4d076816a3fc98b2cc55 not found: ID does not exist" Dec 03 22:55:44 crc kubenswrapper[4830]: I1203 22:55:44.829920 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 22:55:45 crc kubenswrapper[4830]: I1203 22:55:45.258009 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a8c3b6fd-772d-4354-9076-a56d78d4ad0a","Type":"ContainerStarted","Data":"3b29e7078e10b94e16406bec2356476c64d54375d7cb9f742a255aa2ea8ad1b2"} Dec 03 22:55:45 crc kubenswrapper[4830]: I1203 22:55:45.350704 4830 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="2ef31bb5-6d0b-4739-8129-c6c7c847ee3e" path="/var/lib/kubelet/pods/2ef31bb5-6d0b-4739-8129-c6c7c847ee3e/volumes" Dec 03 22:55:56 crc kubenswrapper[4830]: I1203 22:55:56.681136 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:55:56 crc kubenswrapper[4830]: I1203 22:55:56.681729 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.021365 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-82vxt"] Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.023655 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.076407 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-82vxt"] Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.208990 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld945\" (UniqueName: \"kubernetes.io/projected/d65dae50-ab93-49fa-96f9-0f94770a0149-kube-api-access-ld945\") pod \"redhat-marketplace-82vxt\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.209123 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-catalog-content\") pod \"redhat-marketplace-82vxt\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.209192 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-utilities\") pod \"redhat-marketplace-82vxt\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.311292 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld945\" (UniqueName: \"kubernetes.io/projected/d65dae50-ab93-49fa-96f9-0f94770a0149-kube-api-access-ld945\") pod \"redhat-marketplace-82vxt\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.311576 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-catalog-content\") pod \"redhat-marketplace-82vxt\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.311755 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-utilities\") pod \"redhat-marketplace-82vxt\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.312143 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-catalog-content\") pod \"redhat-marketplace-82vxt\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.312189 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-utilities\") pod \"redhat-marketplace-82vxt\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.331930 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld945\" (UniqueName: \"kubernetes.io/projected/d65dae50-ab93-49fa-96f9-0f94770a0149-kube-api-access-ld945\") pod \"redhat-marketplace-82vxt\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:55:58 crc kubenswrapper[4830]: I1203 22:55:58.403121 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:55:59 crc kubenswrapper[4830]: I1203 22:55:59.004758 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-82vxt"] Dec 03 22:55:59 crc kubenswrapper[4830]: I1203 22:55:59.415127 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82vxt" event={"ID":"d65dae50-ab93-49fa-96f9-0f94770a0149","Type":"ContainerStarted","Data":"64556525461c88dd51604b63196da9d37ca9d34a1eb4bf1bad44c1a1a72618d9"} Dec 03 22:56:01 crc kubenswrapper[4830]: I1203 22:56:01.447820 4830 generic.go:334] "Generic (PLEG): container finished" podID="d65dae50-ab93-49fa-96f9-0f94770a0149" containerID="1cbccd9e368cb4c1591c89357c5a131b5a0c35ff92845b8b036ff9423c78c25f" exitCode=0 Dec 03 22:56:01 crc kubenswrapper[4830]: I1203 22:56:01.447876 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82vxt" event={"ID":"d65dae50-ab93-49fa-96f9-0f94770a0149","Type":"ContainerDied","Data":"1cbccd9e368cb4c1591c89357c5a131b5a0c35ff92845b8b036ff9423c78c25f"} Dec 03 22:56:26 crc kubenswrapper[4830]: I1203 22:56:26.681470 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:56:26 crc kubenswrapper[4830]: I1203 22:56:26.682146 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:56:29 crc kubenswrapper[4830]: E1203 22:56:29.325749 4830 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 03 22:56:29 crc kubenswrapper[4830]: E1203 22:56:29.327349 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tem
pest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mkckx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(a8c3b6fd-772d-4354-9076-a56d78d4ad0a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 22:56:29 crc kubenswrapper[4830]: E1203 22:56:29.328746 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="a8c3b6fd-772d-4354-9076-a56d78d4ad0a" Dec 03 22:56:29 crc kubenswrapper[4830]: I1203 22:56:29.753822 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-82vxt" event={"ID":"d65dae50-ab93-49fa-96f9-0f94770a0149","Type":"ContainerStarted","Data":"116c5b1d32c806ee92cd5560d43794fdb5981708c35866e1e122c50424d1d833"} Dec 03 22:56:29 crc kubenswrapper[4830]: E1203 22:56:29.759377 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="a8c3b6fd-772d-4354-9076-a56d78d4ad0a" Dec 03 22:56:30 crc kubenswrapper[4830]: I1203 22:56:30.764736 4830 generic.go:334] "Generic (PLEG): container finished" podID="d65dae50-ab93-49fa-96f9-0f94770a0149" containerID="116c5b1d32c806ee92cd5560d43794fdb5981708c35866e1e122c50424d1d833" exitCode=0 Dec 03 22:56:30 crc kubenswrapper[4830]: I1203 22:56:30.764784 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82vxt" event={"ID":"d65dae50-ab93-49fa-96f9-0f94770a0149","Type":"ContainerDied","Data":"116c5b1d32c806ee92cd5560d43794fdb5981708c35866e1e122c50424d1d833"} Dec 03 22:56:31 crc kubenswrapper[4830]: I1203 22:56:31.777428 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82vxt" event={"ID":"d65dae50-ab93-49fa-96f9-0f94770a0149","Type":"ContainerStarted","Data":"49b4cfb87f02d5dd714ca65638585b9904ace13eef3d4e343cb06a465a3374af"} Dec 03 22:56:31 crc kubenswrapper[4830]: I1203 22:56:31.799183 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-82vxt" podStartSLOduration=12.211923145 podStartE2EDuration="33.79915627s" podCreationTimestamp="2025-12-03 22:55:58 +0000 UTC" firstStartedPulling="2025-12-03 22:56:09.670405051 +0000 UTC m=+3058.666866400" lastFinishedPulling="2025-12-03 22:56:31.257638176 +0000 UTC m=+3080.254099525" 
observedRunningTime="2025-12-03 22:56:31.796789647 +0000 UTC m=+3080.793250996" watchObservedRunningTime="2025-12-03 22:56:31.79915627 +0000 UTC m=+3080.795617639" Dec 03 22:56:38 crc kubenswrapper[4830]: I1203 22:56:38.403430 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:56:38 crc kubenswrapper[4830]: I1203 22:56:38.404008 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:56:38 crc kubenswrapper[4830]: I1203 22:56:38.465678 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:56:38 crc kubenswrapper[4830]: I1203 22:56:38.897804 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:56:38 crc kubenswrapper[4830]: I1203 22:56:38.951425 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-82vxt"] Dec 03 22:56:40 crc kubenswrapper[4830]: I1203 22:56:40.864183 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-82vxt" podUID="d65dae50-ab93-49fa-96f9-0f94770a0149" containerName="registry-server" containerID="cri-o://49b4cfb87f02d5dd714ca65638585b9904ace13eef3d4e343cb06a465a3374af" gracePeriod=2 Dec 03 22:56:41 crc kubenswrapper[4830]: I1203 22:56:41.895058 4830 generic.go:334] "Generic (PLEG): container finished" podID="d65dae50-ab93-49fa-96f9-0f94770a0149" containerID="49b4cfb87f02d5dd714ca65638585b9904ace13eef3d4e343cb06a465a3374af" exitCode=0 Dec 03 22:56:41 crc kubenswrapper[4830]: I1203 22:56:41.895141 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82vxt" 
event={"ID":"d65dae50-ab93-49fa-96f9-0f94770a0149","Type":"ContainerDied","Data":"49b4cfb87f02d5dd714ca65638585b9904ace13eef3d4e343cb06a465a3374af"} Dec 03 22:56:41 crc kubenswrapper[4830]: I1203 22:56:41.895389 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82vxt" event={"ID":"d65dae50-ab93-49fa-96f9-0f94770a0149","Type":"ContainerDied","Data":"64556525461c88dd51604b63196da9d37ca9d34a1eb4bf1bad44c1a1a72618d9"} Dec 03 22:56:41 crc kubenswrapper[4830]: I1203 22:56:41.895418 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64556525461c88dd51604b63196da9d37ca9d34a1eb4bf1bad44c1a1a72618d9" Dec 03 22:56:41 crc kubenswrapper[4830]: I1203 22:56:41.909138 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 22:56:42.012046 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-utilities\") pod \"d65dae50-ab93-49fa-96f9-0f94770a0149\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 22:56:42.012113 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld945\" (UniqueName: \"kubernetes.io/projected/d65dae50-ab93-49fa-96f9-0f94770a0149-kube-api-access-ld945\") pod \"d65dae50-ab93-49fa-96f9-0f94770a0149\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 22:56:42.012159 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-catalog-content\") pod \"d65dae50-ab93-49fa-96f9-0f94770a0149\" (UID: \"d65dae50-ab93-49fa-96f9-0f94770a0149\") " Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 
22:56:42.012834 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-utilities" (OuterVolumeSpecName: "utilities") pod "d65dae50-ab93-49fa-96f9-0f94770a0149" (UID: "d65dae50-ab93-49fa-96f9-0f94770a0149"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 22:56:42.019166 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d65dae50-ab93-49fa-96f9-0f94770a0149-kube-api-access-ld945" (OuterVolumeSpecName: "kube-api-access-ld945") pod "d65dae50-ab93-49fa-96f9-0f94770a0149" (UID: "d65dae50-ab93-49fa-96f9-0f94770a0149"). InnerVolumeSpecName "kube-api-access-ld945". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 22:56:42.038608 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d65dae50-ab93-49fa-96f9-0f94770a0149" (UID: "d65dae50-ab93-49fa-96f9-0f94770a0149"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 22:56:42.114956 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 22:56:42.115000 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld945\" (UniqueName: \"kubernetes.io/projected/d65dae50-ab93-49fa-96f9-0f94770a0149-kube-api-access-ld945\") on node \"crc\" DevicePath \"\"" Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 22:56:42.115018 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65dae50-ab93-49fa-96f9-0f94770a0149-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 22:56:42.903275 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82vxt" Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 22:56:42.956828 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-82vxt"] Dec 03 22:56:42 crc kubenswrapper[4830]: I1203 22:56:42.970685 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-82vxt"] Dec 03 22:56:43 crc kubenswrapper[4830]: I1203 22:56:43.358138 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d65dae50-ab93-49fa-96f9-0f94770a0149" path="/var/lib/kubelet/pods/d65dae50-ab93-49fa-96f9-0f94770a0149/volumes" Dec 03 22:56:43 crc kubenswrapper[4830]: I1203 22:56:43.885198 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 22:56:45 crc kubenswrapper[4830]: I1203 22:56:45.938988 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"a8c3b6fd-772d-4354-9076-a56d78d4ad0a","Type":"ContainerStarted","Data":"93baac9a22186d9402d8a8c7e73e8e7d4cc06439e78c8d47690b77ff64696695"} Dec 03 22:56:45 crc kubenswrapper[4830]: I1203 22:56:45.975604 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.947275229 podStartE2EDuration="1m3.975569592s" podCreationTimestamp="2025-12-03 22:55:42 +0000 UTC" firstStartedPulling="2025-12-03 22:55:44.853952163 +0000 UTC m=+3033.850413522" lastFinishedPulling="2025-12-03 22:56:43.882246536 +0000 UTC m=+3092.878707885" observedRunningTime="2025-12-03 22:56:45.961779648 +0000 UTC m=+3094.958241007" watchObservedRunningTime="2025-12-03 22:56:45.975569592 +0000 UTC m=+3094.972030941" Dec 03 22:56:56 crc kubenswrapper[4830]: I1203 22:56:56.681585 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 22:56:56 crc kubenswrapper[4830]: I1203 22:56:56.682099 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 22:56:56 crc kubenswrapper[4830]: I1203 22:56:56.682154 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 22:56:56 crc kubenswrapper[4830]: I1203 22:56:56.683067 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 22:56:56 crc kubenswrapper[4830]: I1203 22:56:56.683124 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" gracePeriod=600 Dec 03 22:56:57 crc kubenswrapper[4830]: I1203 22:56:57.049231 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" exitCode=0 Dec 03 22:56:57 crc kubenswrapper[4830]: I1203 22:56:57.049285 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1"} Dec 03 22:56:57 crc kubenswrapper[4830]: I1203 22:56:57.049324 4830 scope.go:117] "RemoveContainer" containerID="c8a87b23e4c7076abba20f763ecb4e8f46e7310b3b10bc11d86d25fed1ce60dc" Dec 03 22:56:57 crc kubenswrapper[4830]: E1203 22:56:57.431904 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:56:58 crc kubenswrapper[4830]: I1203 22:56:58.073473 4830 
scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:56:58 crc kubenswrapper[4830]: E1203 22:56:58.074164 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:57:12 crc kubenswrapper[4830]: I1203 22:57:12.336894 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:57:12 crc kubenswrapper[4830]: E1203 22:57:12.337927 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:57:24 crc kubenswrapper[4830]: I1203 22:57:24.337179 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:57:24 crc kubenswrapper[4830]: E1203 22:57:24.337930 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:57:35 crc kubenswrapper[4830]: I1203 
22:57:35.339124 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:57:35 crc kubenswrapper[4830]: E1203 22:57:35.340177 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:57:46 crc kubenswrapper[4830]: I1203 22:57:46.338734 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:57:46 crc kubenswrapper[4830]: E1203 22:57:46.339636 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:58:00 crc kubenswrapper[4830]: I1203 22:58:00.337821 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:58:00 crc kubenswrapper[4830]: E1203 22:58:00.338886 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:58:12 crc 
kubenswrapper[4830]: I1203 22:58:12.337187 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:58:12 crc kubenswrapper[4830]: E1203 22:58:12.339421 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:58:23 crc kubenswrapper[4830]: I1203 22:58:23.337189 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:58:23 crc kubenswrapper[4830]: E1203 22:58:23.338831 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:58:38 crc kubenswrapper[4830]: I1203 22:58:38.337459 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:58:38 crc kubenswrapper[4830]: E1203 22:58:38.338117 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 
03 22:58:52 crc kubenswrapper[4830]: I1203 22:58:52.337825 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:58:52 crc kubenswrapper[4830]: E1203 22:58:52.339118 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:59:07 crc kubenswrapper[4830]: I1203 22:59:07.337933 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:59:07 crc kubenswrapper[4830]: E1203 22:59:07.338877 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:59:18 crc kubenswrapper[4830]: I1203 22:59:18.337568 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:59:18 crc kubenswrapper[4830]: E1203 22:59:18.339106 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" 
podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:59:32 crc kubenswrapper[4830]: I1203 22:59:32.336680 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:59:32 crc kubenswrapper[4830]: E1203 22:59:32.337590 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:59:44 crc kubenswrapper[4830]: I1203 22:59:44.337635 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:59:44 crc kubenswrapper[4830]: E1203 22:59:44.338415 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 22:59:59 crc kubenswrapper[4830]: I1203 22:59:59.338109 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 22:59:59 crc kubenswrapper[4830]: E1203 22:59:59.339374 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.147665 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn"] Dec 03 23:00:00 crc kubenswrapper[4830]: E1203 23:00:00.148122 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65dae50-ab93-49fa-96f9-0f94770a0149" containerName="extract-utilities" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.148138 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65dae50-ab93-49fa-96f9-0f94770a0149" containerName="extract-utilities" Dec 03 23:00:00 crc kubenswrapper[4830]: E1203 23:00:00.148145 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65dae50-ab93-49fa-96f9-0f94770a0149" containerName="extract-content" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.148153 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65dae50-ab93-49fa-96f9-0f94770a0149" containerName="extract-content" Dec 03 23:00:00 crc kubenswrapper[4830]: E1203 23:00:00.148183 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65dae50-ab93-49fa-96f9-0f94770a0149" containerName="registry-server" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.148189 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65dae50-ab93-49fa-96f9-0f94770a0149" containerName="registry-server" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.148384 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d65dae50-ab93-49fa-96f9-0f94770a0149" containerName="registry-server" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.149204 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.151620 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.155461 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.164881 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn"] Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.281971 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa173630-5339-4850-8e61-e084c26053d3-secret-volume\") pod \"collect-profiles-29413380-r5nbn\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.282111 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfwvc\" (UniqueName: \"kubernetes.io/projected/aa173630-5339-4850-8e61-e084c26053d3-kube-api-access-rfwvc\") pod \"collect-profiles-29413380-r5nbn\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.282156 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa173630-5339-4850-8e61-e084c26053d3-config-volume\") pod \"collect-profiles-29413380-r5nbn\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.383850 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa173630-5339-4850-8e61-e084c26053d3-config-volume\") pod \"collect-profiles-29413380-r5nbn\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.384002 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa173630-5339-4850-8e61-e084c26053d3-secret-volume\") pod \"collect-profiles-29413380-r5nbn\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.384141 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfwvc\" (UniqueName: \"kubernetes.io/projected/aa173630-5339-4850-8e61-e084c26053d3-kube-api-access-rfwvc\") pod \"collect-profiles-29413380-r5nbn\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.384724 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa173630-5339-4850-8e61-e084c26053d3-config-volume\") pod \"collect-profiles-29413380-r5nbn\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.391173 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/aa173630-5339-4850-8e61-e084c26053d3-secret-volume\") pod \"collect-profiles-29413380-r5nbn\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.409300 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfwvc\" (UniqueName: \"kubernetes.io/projected/aa173630-5339-4850-8e61-e084c26053d3-kube-api-access-rfwvc\") pod \"collect-profiles-29413380-r5nbn\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.472024 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:00 crc kubenswrapper[4830]: I1203 23:00:00.958230 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn"] Dec 03 23:00:01 crc kubenswrapper[4830]: I1203 23:00:01.884114 4830 generic.go:334] "Generic (PLEG): container finished" podID="aa173630-5339-4850-8e61-e084c26053d3" containerID="8d5ef9b7d416a976f15422dfadfb9152c31b18d8c6e92bf8913e1efd2b589038" exitCode=0 Dec 03 23:00:01 crc kubenswrapper[4830]: I1203 23:00:01.884312 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" event={"ID":"aa173630-5339-4850-8e61-e084c26053d3","Type":"ContainerDied","Data":"8d5ef9b7d416a976f15422dfadfb9152c31b18d8c6e92bf8913e1efd2b589038"} Dec 03 23:00:01 crc kubenswrapper[4830]: I1203 23:00:01.884445 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" 
event={"ID":"aa173630-5339-4850-8e61-e084c26053d3","Type":"ContainerStarted","Data":"17cf5400ebfa0ee171dde1418c330dfb0b922ab33261e18f9cb6d7c050d2e336"} Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.489868 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.649235 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa173630-5339-4850-8e61-e084c26053d3-secret-volume\") pod \"aa173630-5339-4850-8e61-e084c26053d3\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.649412 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfwvc\" (UniqueName: \"kubernetes.io/projected/aa173630-5339-4850-8e61-e084c26053d3-kube-api-access-rfwvc\") pod \"aa173630-5339-4850-8e61-e084c26053d3\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.649466 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa173630-5339-4850-8e61-e084c26053d3-config-volume\") pod \"aa173630-5339-4850-8e61-e084c26053d3\" (UID: \"aa173630-5339-4850-8e61-e084c26053d3\") " Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.650174 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa173630-5339-4850-8e61-e084c26053d3-config-volume" (OuterVolumeSpecName: "config-volume") pod "aa173630-5339-4850-8e61-e084c26053d3" (UID: "aa173630-5339-4850-8e61-e084c26053d3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.654840 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa173630-5339-4850-8e61-e084c26053d3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aa173630-5339-4850-8e61-e084c26053d3" (UID: "aa173630-5339-4850-8e61-e084c26053d3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.654920 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa173630-5339-4850-8e61-e084c26053d3-kube-api-access-rfwvc" (OuterVolumeSpecName: "kube-api-access-rfwvc") pod "aa173630-5339-4850-8e61-e084c26053d3" (UID: "aa173630-5339-4850-8e61-e084c26053d3"). InnerVolumeSpecName "kube-api-access-rfwvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.751649 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aa173630-5339-4850-8e61-e084c26053d3-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.751696 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfwvc\" (UniqueName: \"kubernetes.io/projected/aa173630-5339-4850-8e61-e084c26053d3-kube-api-access-rfwvc\") on node \"crc\" DevicePath \"\"" Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.751711 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa173630-5339-4850-8e61-e084c26053d3-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.907166 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" 
event={"ID":"aa173630-5339-4850-8e61-e084c26053d3","Type":"ContainerDied","Data":"17cf5400ebfa0ee171dde1418c330dfb0b922ab33261e18f9cb6d7c050d2e336"} Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.907593 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17cf5400ebfa0ee171dde1418c330dfb0b922ab33261e18f9cb6d7c050d2e336" Dec 03 23:00:03 crc kubenswrapper[4830]: I1203 23:00:03.907211 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-r5nbn" Dec 03 23:00:04 crc kubenswrapper[4830]: I1203 23:00:04.593183 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl"] Dec 03 23:00:04 crc kubenswrapper[4830]: I1203 23:00:04.602281 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413335-59kgl"] Dec 03 23:00:05 crc kubenswrapper[4830]: I1203 23:00:05.347430 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd1ff4c-8898-4d90-8f38-74e7c94f57da" path="/var/lib/kubelet/pods/4dd1ff4c-8898-4d90-8f38-74e7c94f57da/volumes" Dec 03 23:00:14 crc kubenswrapper[4830]: I1203 23:00:14.337072 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 23:00:14 crc kubenswrapper[4830]: E1203 23:00:14.337957 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:00:27 crc kubenswrapper[4830]: I1203 23:00:27.337478 4830 scope.go:117] "RemoveContainer" 
containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 23:00:27 crc kubenswrapper[4830]: E1203 23:00:27.338316 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:00:29 crc kubenswrapper[4830]: I1203 23:00:29.344878 4830 scope.go:117] "RemoveContainer" containerID="f2f69b519ed5a3184685e59e69cc6b6537c08f223ce6b678946d0cbd4da13ff3" Dec 03 23:00:42 crc kubenswrapper[4830]: I1203 23:00:42.336784 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 23:00:42 crc kubenswrapper[4830]: E1203 23:00:42.337386 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:00:55 crc kubenswrapper[4830]: I1203 23:00:55.338262 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 23:00:55 crc kubenswrapper[4830]: E1203 23:00:55.340965 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.160686 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29413381-s6q5j"] Dec 03 23:01:00 crc kubenswrapper[4830]: E1203 23:01:00.161863 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa173630-5339-4850-8e61-e084c26053d3" containerName="collect-profiles" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.161881 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa173630-5339-4850-8e61-e084c26053d3" containerName="collect-profiles" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.162202 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa173630-5339-4850-8e61-e084c26053d3" containerName="collect-profiles" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.163161 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.174947 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29413381-s6q5j"] Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.361389 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-config-data\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.361559 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggqjm\" (UniqueName: \"kubernetes.io/projected/29ffb417-4cef-4a91-a9f7-74fcc99e14df-kube-api-access-ggqjm\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") 
" pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.361601 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-combined-ca-bundle\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.362718 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-fernet-keys\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.464855 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-config-data\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.464972 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggqjm\" (UniqueName: \"kubernetes.io/projected/29ffb417-4cef-4a91-a9f7-74fcc99e14df-kube-api-access-ggqjm\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.465014 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-combined-ca-bundle\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " 
pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.465115 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-fernet-keys\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.471211 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-fernet-keys\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.471460 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-config-data\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.473129 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-combined-ca-bundle\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 23:01:00.481258 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggqjm\" (UniqueName: \"kubernetes.io/projected/29ffb417-4cef-4a91-a9f7-74fcc99e14df-kube-api-access-ggqjm\") pod \"keystone-cron-29413381-s6q5j\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:00 crc kubenswrapper[4830]: I1203 
23:01:00.485047 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:01 crc kubenswrapper[4830]: I1203 23:01:01.498503 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29413381-s6q5j"] Dec 03 23:01:02 crc kubenswrapper[4830]: I1203 23:01:02.484335 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413381-s6q5j" event={"ID":"29ffb417-4cef-4a91-a9f7-74fcc99e14df","Type":"ContainerStarted","Data":"4d714d56c033c04966e0ff41b7ed80b5c1b4e2d67b5b6106804e456571a66cf4"} Dec 03 23:01:02 crc kubenswrapper[4830]: I1203 23:01:02.484779 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413381-s6q5j" event={"ID":"29ffb417-4cef-4a91-a9f7-74fcc99e14df","Type":"ContainerStarted","Data":"2fbfb11919dddd5268036db16c456da3a3a60019a6245f3f6b6ee9c615f32d5d"} Dec 03 23:01:02 crc kubenswrapper[4830]: I1203 23:01:02.506727 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29413381-s6q5j" podStartSLOduration=2.506703979 podStartE2EDuration="2.506703979s" podCreationTimestamp="2025-12-03 23:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:01:02.49860791 +0000 UTC m=+3351.495069259" watchObservedRunningTime="2025-12-03 23:01:02.506703979 +0000 UTC m=+3351.503165328" Dec 03 23:01:04 crc kubenswrapper[4830]: I1203 23:01:04.512577 4830 generic.go:334] "Generic (PLEG): container finished" podID="29ffb417-4cef-4a91-a9f7-74fcc99e14df" containerID="4d714d56c033c04966e0ff41b7ed80b5c1b4e2d67b5b6106804e456571a66cf4" exitCode=0 Dec 03 23:01:04 crc kubenswrapper[4830]: I1203 23:01:04.512674 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413381-s6q5j" 
event={"ID":"29ffb417-4cef-4a91-a9f7-74fcc99e14df","Type":"ContainerDied","Data":"4d714d56c033c04966e0ff41b7ed80b5c1b4e2d67b5b6106804e456571a66cf4"} Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.137622 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.293608 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-fernet-keys\") pod \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.293794 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggqjm\" (UniqueName: \"kubernetes.io/projected/29ffb417-4cef-4a91-a9f7-74fcc99e14df-kube-api-access-ggqjm\") pod \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.293829 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-config-data\") pod \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.293941 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-combined-ca-bundle\") pod \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\" (UID: \"29ffb417-4cef-4a91-a9f7-74fcc99e14df\") " Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.300297 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ffb417-4cef-4a91-a9f7-74fcc99e14df-kube-api-access-ggqjm" 
(OuterVolumeSpecName: "kube-api-access-ggqjm") pod "29ffb417-4cef-4a91-a9f7-74fcc99e14df" (UID: "29ffb417-4cef-4a91-a9f7-74fcc99e14df"). InnerVolumeSpecName "kube-api-access-ggqjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.306787 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "29ffb417-4cef-4a91-a9f7-74fcc99e14df" (UID: "29ffb417-4cef-4a91-a9f7-74fcc99e14df"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.333742 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29ffb417-4cef-4a91-a9f7-74fcc99e14df" (UID: "29ffb417-4cef-4a91-a9f7-74fcc99e14df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.355227 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-config-data" (OuterVolumeSpecName: "config-data") pod "29ffb417-4cef-4a91-a9f7-74fcc99e14df" (UID: "29ffb417-4cef-4a91-a9f7-74fcc99e14df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.396013 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggqjm\" (UniqueName: \"kubernetes.io/projected/29ffb417-4cef-4a91-a9f7-74fcc99e14df-kube-api-access-ggqjm\") on node \"crc\" DevicePath \"\"" Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.396045 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.396055 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.396064 4830 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29ffb417-4cef-4a91-a9f7-74fcc99e14df-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.531228 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413381-s6q5j" event={"ID":"29ffb417-4cef-4a91-a9f7-74fcc99e14df","Type":"ContainerDied","Data":"2fbfb11919dddd5268036db16c456da3a3a60019a6245f3f6b6ee9c615f32d5d"} Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.531271 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fbfb11919dddd5268036db16c456da3a3a60019a6245f3f6b6ee9c615f32d5d" Dec 03 23:01:06 crc kubenswrapper[4830]: I1203 23:01:06.531330 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29413381-s6q5j" Dec 03 23:01:07 crc kubenswrapper[4830]: I1203 23:01:07.337727 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 23:01:07 crc kubenswrapper[4830]: E1203 23:01:07.338329 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:01:21 crc kubenswrapper[4830]: I1203 23:01:21.348496 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 23:01:21 crc kubenswrapper[4830]: E1203 23:01:21.349706 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:01:34 crc kubenswrapper[4830]: I1203 23:01:34.336869 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 23:01:34 crc kubenswrapper[4830]: E1203 23:01:34.337777 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:01:45 crc kubenswrapper[4830]: I1203 23:01:45.337045 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 23:01:45 crc kubenswrapper[4830]: E1203 23:01:45.338100 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:01:59 crc kubenswrapper[4830]: I1203 23:01:59.040214 4830 generic.go:334] "Generic (PLEG): container finished" podID="a8c3b6fd-772d-4354-9076-a56d78d4ad0a" containerID="93baac9a22186d9402d8a8c7e73e8e7d4cc06439e78c8d47690b77ff64696695" exitCode=0 Dec 03 23:01:59 crc kubenswrapper[4830]: I1203 23:01:59.040318 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a8c3b6fd-772d-4354-9076-a56d78d4ad0a","Type":"ContainerDied","Data":"93baac9a22186d9402d8a8c7e73e8e7d4cc06439e78c8d47690b77ff64696695"} Dec 03 23:01:59 crc kubenswrapper[4830]: I1203 23:01:59.336909 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.066805 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"f09a70d7650c20489f7d7cca35808804f160b2ea0e82eb2544a3ff9061e7435c"} Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.606788 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.649378 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-config-data\") pod \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.649445 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ca-certs\") pod \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.649535 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-temporary\") pod \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.649622 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config-secret\") pod \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.649647 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.649696 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config\") pod \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.649743 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkckx\" (UniqueName: \"kubernetes.io/projected/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-kube-api-access-mkckx\") pod \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.649771 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-workdir\") pod \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.649804 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ssh-key\") pod \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\" (UID: \"a8c3b6fd-772d-4354-9076-a56d78d4ad0a\") " Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.650488 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-config-data" (OuterVolumeSpecName: "config-data") pod "a8c3b6fd-772d-4354-9076-a56d78d4ad0a" (UID: "a8c3b6fd-772d-4354-9076-a56d78d4ad0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.650722 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "a8c3b6fd-772d-4354-9076-a56d78d4ad0a" (UID: "a8c3b6fd-772d-4354-9076-a56d78d4ad0a"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.658622 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-kube-api-access-mkckx" (OuterVolumeSpecName: "kube-api-access-mkckx") pod "a8c3b6fd-772d-4354-9076-a56d78d4ad0a" (UID: "a8c3b6fd-772d-4354-9076-a56d78d4ad0a"). InnerVolumeSpecName "kube-api-access-mkckx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.687616 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "a8c3b6fd-772d-4354-9076-a56d78d4ad0a" (UID: "a8c3b6fd-772d-4354-9076-a56d78d4ad0a"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.695440 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a8c3b6fd-772d-4354-9076-a56d78d4ad0a" (UID: "a8c3b6fd-772d-4354-9076-a56d78d4ad0a"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.696242 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a8c3b6fd-772d-4354-9076-a56d78d4ad0a" (UID: "a8c3b6fd-772d-4354-9076-a56d78d4ad0a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.697553 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "a8c3b6fd-772d-4354-9076-a56d78d4ad0a" (UID: "a8c3b6fd-772d-4354-9076-a56d78d4ad0a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.745827 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a8c3b6fd-772d-4354-9076-a56d78d4ad0a" (UID: "a8c3b6fd-772d-4354-9076-a56d78d4ad0a"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.752611 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.752643 4830 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.752653 4830 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.752663 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.752693 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.752702 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.752711 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkckx\" (UniqueName: \"kubernetes.io/projected/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-kube-api-access-mkckx\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:00 crc 
kubenswrapper[4830]: I1203 23:02:00.752723 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.781152 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 23:02:00 crc kubenswrapper[4830]: I1203 23:02:00.854149 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:01 crc kubenswrapper[4830]: I1203 23:02:01.056676 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "a8c3b6fd-772d-4354-9076-a56d78d4ad0a" (UID: "a8c3b6fd-772d-4354-9076-a56d78d4ad0a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:02:01 crc kubenswrapper[4830]: I1203 23:02:01.058199 4830 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a8c3b6fd-772d-4354-9076-a56d78d4ad0a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:01 crc kubenswrapper[4830]: I1203 23:02:01.076990 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a8c3b6fd-772d-4354-9076-a56d78d4ad0a","Type":"ContainerDied","Data":"3b29e7078e10b94e16406bec2356476c64d54375d7cb9f742a255aa2ea8ad1b2"} Dec 03 23:02:01 crc kubenswrapper[4830]: I1203 23:02:01.077036 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b29e7078e10b94e16406bec2356476c64d54375d7cb9f742a255aa2ea8ad1b2" Dec 03 23:02:01 crc kubenswrapper[4830]: I1203 23:02:01.077071 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.018859 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 23:02:12 crc kubenswrapper[4830]: E1203 23:02:12.020292 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ffb417-4cef-4a91-a9f7-74fcc99e14df" containerName="keystone-cron" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.020313 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ffb417-4cef-4a91-a9f7-74fcc99e14df" containerName="keystone-cron" Dec 03 23:02:12 crc kubenswrapper[4830]: E1203 23:02:12.020329 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c3b6fd-772d-4354-9076-a56d78d4ad0a" containerName="tempest-tests-tempest-tests-runner" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.020336 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a8c3b6fd-772d-4354-9076-a56d78d4ad0a" containerName="tempest-tests-tempest-tests-runner" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.020631 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c3b6fd-772d-4354-9076-a56d78d4ad0a" containerName="tempest-tests-tempest-tests-runner" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.020654 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ffb417-4cef-4a91-a9f7-74fcc99e14df" containerName="keystone-cron" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.021644 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.025730 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w6mk7" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.057313 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.191970 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t59h9\" (UniqueName: \"kubernetes.io/projected/ec873385-77d4-4549-8670-b2f507bd999b-kube-api-access-t59h9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ec873385-77d4-4549-8670-b2f507bd999b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.192037 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ec873385-77d4-4549-8670-b2f507bd999b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 
23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.294107 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t59h9\" (UniqueName: \"kubernetes.io/projected/ec873385-77d4-4549-8670-b2f507bd999b-kube-api-access-t59h9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ec873385-77d4-4549-8670-b2f507bd999b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.294170 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ec873385-77d4-4549-8670-b2f507bd999b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.294645 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ec873385-77d4-4549-8670-b2f507bd999b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.312316 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t59h9\" (UniqueName: \"kubernetes.io/projected/ec873385-77d4-4549-8670-b2f507bd999b-kube-api-access-t59h9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ec873385-77d4-4549-8670-b2f507bd999b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.324287 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ec873385-77d4-4549-8670-b2f507bd999b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.350865 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.854472 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 23:02:12 crc kubenswrapper[4830]: I1203 23:02:12.864152 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:02:13 crc kubenswrapper[4830]: I1203 23:02:13.190836 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ec873385-77d4-4549-8670-b2f507bd999b","Type":"ContainerStarted","Data":"b5e90a0a1757c5e6d7103e83a992f26dae7a6b1fa83ea21683152f57bd988411"} Dec 03 23:02:14 crc kubenswrapper[4830]: I1203 23:02:14.201766 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ec873385-77d4-4549-8670-b2f507bd999b","Type":"ContainerStarted","Data":"1ca4f29f9e29e72b9e5343839e35a6f9f8f49ae86fc4af58e8664ac194c06c63"} Dec 03 23:02:14 crc kubenswrapper[4830]: I1203 23:02:14.221688 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.347384296 podStartE2EDuration="3.221672695s" podCreationTimestamp="2025-12-03 23:02:11 +0000 UTC" firstStartedPulling="2025-12-03 23:02:12.863658248 +0000 UTC m=+3421.860119607" lastFinishedPulling="2025-12-03 23:02:13.737946647 +0000 UTC m=+3422.734408006" observedRunningTime="2025-12-03 23:02:14.22000598 +0000 UTC m=+3423.216467339" 
watchObservedRunningTime="2025-12-03 23:02:14.221672695 +0000 UTC m=+3423.218134044" Dec 03 23:02:29 crc kubenswrapper[4830]: I1203 23:02:29.431819 4830 scope.go:117] "RemoveContainer" containerID="1cbccd9e368cb4c1591c89357c5a131b5a0c35ff92845b8b036ff9423c78c25f" Dec 03 23:02:29 crc kubenswrapper[4830]: I1203 23:02:29.463437 4830 scope.go:117] "RemoveContainer" containerID="116c5b1d32c806ee92cd5560d43794fdb5981708c35866e1e122c50424d1d833" Dec 03 23:02:32 crc kubenswrapper[4830]: I1203 23:02:32.956846 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vfr97"] Dec 03 23:02:32 crc kubenswrapper[4830]: I1203 23:02:32.959959 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:32 crc kubenswrapper[4830]: I1203 23:02:32.970446 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vfr97"] Dec 03 23:02:33 crc kubenswrapper[4830]: I1203 23:02:33.005228 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-utilities\") pod \"certified-operators-vfr97\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:33 crc kubenswrapper[4830]: I1203 23:02:33.005293 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-catalog-content\") pod \"certified-operators-vfr97\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:33 crc kubenswrapper[4830]: I1203 23:02:33.005654 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85fvz\" 
(UniqueName: \"kubernetes.io/projected/df593971-f66c-45b3-b912-cf2afb6eda03-kube-api-access-85fvz\") pod \"certified-operators-vfr97\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:33 crc kubenswrapper[4830]: I1203 23:02:33.108412 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85fvz\" (UniqueName: \"kubernetes.io/projected/df593971-f66c-45b3-b912-cf2afb6eda03-kube-api-access-85fvz\") pod \"certified-operators-vfr97\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:33 crc kubenswrapper[4830]: I1203 23:02:33.109019 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-utilities\") pod \"certified-operators-vfr97\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:33 crc kubenswrapper[4830]: I1203 23:02:33.109043 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-catalog-content\") pod \"certified-operators-vfr97\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:33 crc kubenswrapper[4830]: I1203 23:02:33.109962 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-catalog-content\") pod \"certified-operators-vfr97\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:33 crc kubenswrapper[4830]: I1203 23:02:33.110242 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-utilities\") pod \"certified-operators-vfr97\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:33 crc kubenswrapper[4830]: I1203 23:02:33.140789 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85fvz\" (UniqueName: \"kubernetes.io/projected/df593971-f66c-45b3-b912-cf2afb6eda03-kube-api-access-85fvz\") pod \"certified-operators-vfr97\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:33 crc kubenswrapper[4830]: I1203 23:02:33.287890 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:33 crc kubenswrapper[4830]: I1203 23:02:33.840912 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vfr97"] Dec 03 23:02:33 crc kubenswrapper[4830]: W1203 23:02:33.844864 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf593971_f66c_45b3_b912_cf2afb6eda03.slice/crio-01b7b118f3ca40319b24ec3891a8bbef2512c8f781d9b368c3c77df4c18f9eb7 WatchSource:0}: Error finding container 01b7b118f3ca40319b24ec3891a8bbef2512c8f781d9b368c3c77df4c18f9eb7: Status 404 returned error can't find the container with id 01b7b118f3ca40319b24ec3891a8bbef2512c8f781d9b368c3c77df4c18f9eb7 Dec 03 23:02:34 crc kubenswrapper[4830]: I1203 23:02:34.417034 4830 generic.go:334] "Generic (PLEG): container finished" podID="df593971-f66c-45b3-b912-cf2afb6eda03" containerID="bdd56fcf9d3a87b57f422e916c5d4f8830cb0b8d65b6ec4bf39d83c7ac4e1259" exitCode=0 Dec 03 23:02:34 crc kubenswrapper[4830]: I1203 23:02:34.417150 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfr97" 
event={"ID":"df593971-f66c-45b3-b912-cf2afb6eda03","Type":"ContainerDied","Data":"bdd56fcf9d3a87b57f422e916c5d4f8830cb0b8d65b6ec4bf39d83c7ac4e1259"} Dec 03 23:02:34 crc kubenswrapper[4830]: I1203 23:02:34.417708 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfr97" event={"ID":"df593971-f66c-45b3-b912-cf2afb6eda03","Type":"ContainerStarted","Data":"01b7b118f3ca40319b24ec3891a8bbef2512c8f781d9b368c3c77df4c18f9eb7"} Dec 03 23:02:34 crc kubenswrapper[4830]: I1203 23:02:34.952717 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-smhgn"] Dec 03 23:02:34 crc kubenswrapper[4830]: I1203 23:02:34.954940 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:34 crc kubenswrapper[4830]: I1203 23:02:34.969238 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smhgn"] Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.058537 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-utilities\") pod \"community-operators-smhgn\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.058873 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-catalog-content\") pod \"community-operators-smhgn\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.059038 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gz446\" (UniqueName: \"kubernetes.io/projected/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-kube-api-access-gz446\") pod \"community-operators-smhgn\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.161093 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz446\" (UniqueName: \"kubernetes.io/projected/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-kube-api-access-gz446\") pod \"community-operators-smhgn\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.161192 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-utilities\") pod \"community-operators-smhgn\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.161219 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-catalog-content\") pod \"community-operators-smhgn\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.161855 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-catalog-content\") pod \"community-operators-smhgn\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.161878 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-utilities\") pod \"community-operators-smhgn\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.182879 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz446\" (UniqueName: \"kubernetes.io/projected/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-kube-api-access-gz446\") pod \"community-operators-smhgn\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.277654 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.504267 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfr97" event={"ID":"df593971-f66c-45b3-b912-cf2afb6eda03","Type":"ContainerStarted","Data":"271c209b032d93e181642224c4882f9b730f0525c5f2cc410da960b15e82e72c"} Dec 03 23:02:35 crc kubenswrapper[4830]: I1203 23:02:35.823037 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smhgn"] Dec 03 23:02:36 crc kubenswrapper[4830]: I1203 23:02:36.518860 4830 generic.go:334] "Generic (PLEG): container finished" podID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" containerID="123a498e9f3bf5dbcbae8eeeada56c9693a3da4d513361c20add23ea86f5c670" exitCode=0 Dec 03 23:02:36 crc kubenswrapper[4830]: I1203 23:02:36.518972 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smhgn" event={"ID":"c3526c5d-f3be-4dfb-b7e6-58c25f52b854","Type":"ContainerDied","Data":"123a498e9f3bf5dbcbae8eeeada56c9693a3da4d513361c20add23ea86f5c670"} Dec 03 23:02:36 crc kubenswrapper[4830]: I1203 23:02:36.519453 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-smhgn" event={"ID":"c3526c5d-f3be-4dfb-b7e6-58c25f52b854","Type":"ContainerStarted","Data":"2229944e1a5c7514f9822b6a5362cad9b693f5349a37de980ac03a11d2db14e5"} Dec 03 23:02:36 crc kubenswrapper[4830]: I1203 23:02:36.522835 4830 generic.go:334] "Generic (PLEG): container finished" podID="df593971-f66c-45b3-b912-cf2afb6eda03" containerID="271c209b032d93e181642224c4882f9b730f0525c5f2cc410da960b15e82e72c" exitCode=0 Dec 03 23:02:36 crc kubenswrapper[4830]: I1203 23:02:36.522883 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfr97" event={"ID":"df593971-f66c-45b3-b912-cf2afb6eda03","Type":"ContainerDied","Data":"271c209b032d93e181642224c4882f9b730f0525c5f2cc410da960b15e82e72c"} Dec 03 23:02:37 crc kubenswrapper[4830]: I1203 23:02:37.534919 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfr97" event={"ID":"df593971-f66c-45b3-b912-cf2afb6eda03","Type":"ContainerStarted","Data":"f338e44940e3224aa21c39d5e93e7755c7b0a0eb2e3e319e6b36dffcdd966edc"} Dec 03 23:02:37 crc kubenswrapper[4830]: I1203 23:02:37.565470 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vfr97" podStartSLOduration=3.001429727 podStartE2EDuration="5.565449829s" podCreationTimestamp="2025-12-03 23:02:32 +0000 UTC" firstStartedPulling="2025-12-03 23:02:34.419749032 +0000 UTC m=+3443.416210391" lastFinishedPulling="2025-12-03 23:02:36.983769134 +0000 UTC m=+3445.980230493" observedRunningTime="2025-12-03 23:02:37.553412102 +0000 UTC m=+3446.549873451" watchObservedRunningTime="2025-12-03 23:02:37.565449829 +0000 UTC m=+3446.561911188" Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.304142 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b9twb/must-gather-4gns5"] Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.306246 4830 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9twb/must-gather-4gns5" Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.309362 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b9twb"/"kube-root-ca.crt" Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.309449 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-b9twb"/"default-dockercfg-vrbnh" Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.310139 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b9twb"/"openshift-service-ca.crt" Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.331764 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b9twb/must-gather-4gns5"] Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.450485 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-must-gather-output\") pod \"must-gather-4gns5\" (UID: \"0087c4c2-365d-4f9f-b93a-165fb0dd62cb\") " pod="openshift-must-gather-b9twb/must-gather-4gns5" Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.450647 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsmkq\" (UniqueName: \"kubernetes.io/projected/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-kube-api-access-nsmkq\") pod \"must-gather-4gns5\" (UID: \"0087c4c2-365d-4f9f-b93a-165fb0dd62cb\") " pod="openshift-must-gather-b9twb/must-gather-4gns5" Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.546032 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smhgn" 
event={"ID":"c3526c5d-f3be-4dfb-b7e6-58c25f52b854","Type":"ContainerStarted","Data":"ac1c37873304c27eb952b5b64a9cd78c041a8bda4a5d0d01fca908c831474abd"} Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.552495 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-must-gather-output\") pod \"must-gather-4gns5\" (UID: \"0087c4c2-365d-4f9f-b93a-165fb0dd62cb\") " pod="openshift-must-gather-b9twb/must-gather-4gns5" Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.552634 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsmkq\" (UniqueName: \"kubernetes.io/projected/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-kube-api-access-nsmkq\") pod \"must-gather-4gns5\" (UID: \"0087c4c2-365d-4f9f-b93a-165fb0dd62cb\") " pod="openshift-must-gather-b9twb/must-gather-4gns5" Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.552927 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-must-gather-output\") pod \"must-gather-4gns5\" (UID: \"0087c4c2-365d-4f9f-b93a-165fb0dd62cb\") " pod="openshift-must-gather-b9twb/must-gather-4gns5" Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.574588 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsmkq\" (UniqueName: \"kubernetes.io/projected/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-kube-api-access-nsmkq\") pod \"must-gather-4gns5\" (UID: \"0087c4c2-365d-4f9f-b93a-165fb0dd62cb\") " pod="openshift-must-gather-b9twb/must-gather-4gns5" Dec 03 23:02:38 crc kubenswrapper[4830]: I1203 23:02:38.628297 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9twb/must-gather-4gns5" Dec 03 23:02:39 crc kubenswrapper[4830]: I1203 23:02:39.611963 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b9twb/must-gather-4gns5"] Dec 03 23:02:39 crc kubenswrapper[4830]: W1203 23:02:39.613558 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0087c4c2_365d_4f9f_b93a_165fb0dd62cb.slice/crio-9b0d5f4548e53e5afdd7aea30df24d885de940f467dbcc73ab5e522d02a2ee27 WatchSource:0}: Error finding container 9b0d5f4548e53e5afdd7aea30df24d885de940f467dbcc73ab5e522d02a2ee27: Status 404 returned error can't find the container with id 9b0d5f4548e53e5afdd7aea30df24d885de940f467dbcc73ab5e522d02a2ee27 Dec 03 23:02:40 crc kubenswrapper[4830]: I1203 23:02:40.563525 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/must-gather-4gns5" event={"ID":"0087c4c2-365d-4f9f-b93a-165fb0dd62cb","Type":"ContainerStarted","Data":"9b0d5f4548e53e5afdd7aea30df24d885de940f467dbcc73ab5e522d02a2ee27"} Dec 03 23:02:40 crc kubenswrapper[4830]: I1203 23:02:40.565232 4830 generic.go:334] "Generic (PLEG): container finished" podID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" containerID="ac1c37873304c27eb952b5b64a9cd78c041a8bda4a5d0d01fca908c831474abd" exitCode=0 Dec 03 23:02:40 crc kubenswrapper[4830]: I1203 23:02:40.565364 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smhgn" event={"ID":"c3526c5d-f3be-4dfb-b7e6-58c25f52b854","Type":"ContainerDied","Data":"ac1c37873304c27eb952b5b64a9cd78c041a8bda4a5d0d01fca908c831474abd"} Dec 03 23:02:42 crc kubenswrapper[4830]: I1203 23:02:42.593074 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smhgn" 
event={"ID":"c3526c5d-f3be-4dfb-b7e6-58c25f52b854","Type":"ContainerStarted","Data":"74b285fffd72498b24e533a7a39ecebbf7a8356ad0511775fe4081dadf4d6a99"} Dec 03 23:02:42 crc kubenswrapper[4830]: I1203 23:02:42.613380 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-smhgn" podStartSLOduration=3.229764946 podStartE2EDuration="8.613354202s" podCreationTimestamp="2025-12-03 23:02:34 +0000 UTC" firstStartedPulling="2025-12-03 23:02:36.522831944 +0000 UTC m=+3445.519293283" lastFinishedPulling="2025-12-03 23:02:41.90642115 +0000 UTC m=+3450.902882539" observedRunningTime="2025-12-03 23:02:42.610295798 +0000 UTC m=+3451.606757157" watchObservedRunningTime="2025-12-03 23:02:42.613354202 +0000 UTC m=+3451.609815591" Dec 03 23:02:43 crc kubenswrapper[4830]: I1203 23:02:43.288455 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:43 crc kubenswrapper[4830]: I1203 23:02:43.288738 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:43 crc kubenswrapper[4830]: I1203 23:02:43.350088 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:43 crc kubenswrapper[4830]: I1203 23:02:43.654499 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:45 crc kubenswrapper[4830]: I1203 23:02:45.282981 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:45 crc kubenswrapper[4830]: I1203 23:02:45.283357 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:46 crc kubenswrapper[4830]: I1203 23:02:46.349527 4830 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-smhgn" podUID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" containerName="registry-server" probeResult="failure" output=< Dec 03 23:02:46 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Dec 03 23:02:46 crc kubenswrapper[4830]: > Dec 03 23:02:46 crc kubenswrapper[4830]: I1203 23:02:46.634354 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/must-gather-4gns5" event={"ID":"0087c4c2-365d-4f9f-b93a-165fb0dd62cb","Type":"ContainerStarted","Data":"3aa8c058bce52985c8a3aa76ae00ad7bf01c01880723d480d647b624cc6bcbdf"} Dec 03 23:02:47 crc kubenswrapper[4830]: I1203 23:02:47.644985 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/must-gather-4gns5" event={"ID":"0087c4c2-365d-4f9f-b93a-165fb0dd62cb","Type":"ContainerStarted","Data":"b78f0457877670c6d82176f97b64c58f3f5db8fe7642b0509ef00f14c5828006"} Dec 03 23:02:47 crc kubenswrapper[4830]: I1203 23:02:47.685774 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b9twb/must-gather-4gns5" podStartSLOduration=3.163767531 podStartE2EDuration="9.685747468s" podCreationTimestamp="2025-12-03 23:02:38 +0000 UTC" firstStartedPulling="2025-12-03 23:02:39.620018656 +0000 UTC m=+3448.616480005" lastFinishedPulling="2025-12-03 23:02:46.141998593 +0000 UTC m=+3455.138459942" observedRunningTime="2025-12-03 23:02:47.664192264 +0000 UTC m=+3456.660653613" watchObservedRunningTime="2025-12-03 23:02:47.685747468 +0000 UTC m=+3456.682208827" Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.150736 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vfr97"] Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.151017 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vfr97" 
podUID="df593971-f66c-45b3-b912-cf2afb6eda03" containerName="registry-server" containerID="cri-o://f338e44940e3224aa21c39d5e93e7755c7b0a0eb2e3e319e6b36dffcdd966edc" gracePeriod=2 Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.655869 4830 generic.go:334] "Generic (PLEG): container finished" podID="df593971-f66c-45b3-b912-cf2afb6eda03" containerID="f338e44940e3224aa21c39d5e93e7755c7b0a0eb2e3e319e6b36dffcdd966edc" exitCode=0 Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.655944 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfr97" event={"ID":"df593971-f66c-45b3-b912-cf2afb6eda03","Type":"ContainerDied","Data":"f338e44940e3224aa21c39d5e93e7755c7b0a0eb2e3e319e6b36dffcdd966edc"} Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.656233 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfr97" event={"ID":"df593971-f66c-45b3-b912-cf2afb6eda03","Type":"ContainerDied","Data":"01b7b118f3ca40319b24ec3891a8bbef2512c8f781d9b368c3c77df4c18f9eb7"} Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.656250 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01b7b118f3ca40319b24ec3891a8bbef2512c8f781d9b368c3c77df4c18f9eb7" Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.719469 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.915757 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-catalog-content\") pod \"df593971-f66c-45b3-b912-cf2afb6eda03\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.916034 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-utilities\") pod \"df593971-f66c-45b3-b912-cf2afb6eda03\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.916077 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85fvz\" (UniqueName: \"kubernetes.io/projected/df593971-f66c-45b3-b912-cf2afb6eda03-kube-api-access-85fvz\") pod \"df593971-f66c-45b3-b912-cf2afb6eda03\" (UID: \"df593971-f66c-45b3-b912-cf2afb6eda03\") " Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.916743 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-utilities" (OuterVolumeSpecName: "utilities") pod "df593971-f66c-45b3-b912-cf2afb6eda03" (UID: "df593971-f66c-45b3-b912-cf2afb6eda03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.922723 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df593971-f66c-45b3-b912-cf2afb6eda03-kube-api-access-85fvz" (OuterVolumeSpecName: "kube-api-access-85fvz") pod "df593971-f66c-45b3-b912-cf2afb6eda03" (UID: "df593971-f66c-45b3-b912-cf2afb6eda03"). InnerVolumeSpecName "kube-api-access-85fvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:02:48 crc kubenswrapper[4830]: I1203 23:02:48.965999 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df593971-f66c-45b3-b912-cf2afb6eda03" (UID: "df593971-f66c-45b3-b912-cf2afb6eda03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:02:49 crc kubenswrapper[4830]: I1203 23:02:49.018798 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:49 crc kubenswrapper[4830]: I1203 23:02:49.018835 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85fvz\" (UniqueName: \"kubernetes.io/projected/df593971-f66c-45b3-b912-cf2afb6eda03-kube-api-access-85fvz\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:49 crc kubenswrapper[4830]: I1203 23:02:49.018851 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df593971-f66c-45b3-b912-cf2afb6eda03-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:02:49 crc kubenswrapper[4830]: I1203 23:02:49.664409 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vfr97" Dec 03 23:02:49 crc kubenswrapper[4830]: I1203 23:02:49.694639 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vfr97"] Dec 03 23:02:49 crc kubenswrapper[4830]: I1203 23:02:49.706470 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vfr97"] Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.291658 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b9twb/crc-debug-64z9r"] Dec 03 23:02:50 crc kubenswrapper[4830]: E1203 23:02:50.292266 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df593971-f66c-45b3-b912-cf2afb6eda03" containerName="extract-utilities" Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.292281 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="df593971-f66c-45b3-b912-cf2afb6eda03" containerName="extract-utilities" Dec 03 23:02:50 crc kubenswrapper[4830]: E1203 23:02:50.292327 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df593971-f66c-45b3-b912-cf2afb6eda03" containerName="registry-server" Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.292333 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="df593971-f66c-45b3-b912-cf2afb6eda03" containerName="registry-server" Dec 03 23:02:50 crc kubenswrapper[4830]: E1203 23:02:50.292345 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df593971-f66c-45b3-b912-cf2afb6eda03" containerName="extract-content" Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.292351 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="df593971-f66c-45b3-b912-cf2afb6eda03" containerName="extract-content" Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.292565 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="df593971-f66c-45b3-b912-cf2afb6eda03" containerName="registry-server" Dec 03 
23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.293265 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-64z9r" Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.445153 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-host\") pod \"crc-debug-64z9r\" (UID: \"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d\") " pod="openshift-must-gather-b9twb/crc-debug-64z9r" Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.446058 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sctlf\" (UniqueName: \"kubernetes.io/projected/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-kube-api-access-sctlf\") pod \"crc-debug-64z9r\" (UID: \"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d\") " pod="openshift-must-gather-b9twb/crc-debug-64z9r" Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.547659 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-host\") pod \"crc-debug-64z9r\" (UID: \"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d\") " pod="openshift-must-gather-b9twb/crc-debug-64z9r" Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.547801 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-host\") pod \"crc-debug-64z9r\" (UID: \"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d\") " pod="openshift-must-gather-b9twb/crc-debug-64z9r" Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.547903 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sctlf\" (UniqueName: \"kubernetes.io/projected/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-kube-api-access-sctlf\") pod \"crc-debug-64z9r\" (UID: 
\"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d\") " pod="openshift-must-gather-b9twb/crc-debug-64z9r" Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.568987 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sctlf\" (UniqueName: \"kubernetes.io/projected/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-kube-api-access-sctlf\") pod \"crc-debug-64z9r\" (UID: \"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d\") " pod="openshift-must-gather-b9twb/crc-debug-64z9r" Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.611323 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-64z9r" Dec 03 23:02:50 crc kubenswrapper[4830]: W1203 23:02:50.647531 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f265a45_b2cd_4d86_b693_91e5d9fb6e7d.slice/crio-fb51901b2f5f935541e1ab2297e35d673921592d974ca55bcfa0a8f9ad3aeea0 WatchSource:0}: Error finding container fb51901b2f5f935541e1ab2297e35d673921592d974ca55bcfa0a8f9ad3aeea0: Status 404 returned error can't find the container with id fb51901b2f5f935541e1ab2297e35d673921592d974ca55bcfa0a8f9ad3aeea0 Dec 03 23:02:50 crc kubenswrapper[4830]: I1203 23:02:50.682703 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/crc-debug-64z9r" event={"ID":"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d","Type":"ContainerStarted","Data":"fb51901b2f5f935541e1ab2297e35d673921592d974ca55bcfa0a8f9ad3aeea0"} Dec 03 23:02:51 crc kubenswrapper[4830]: I1203 23:02:51.351309 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df593971-f66c-45b3-b912-cf2afb6eda03" path="/var/lib/kubelet/pods/df593971-f66c-45b3-b912-cf2afb6eda03/volumes" Dec 03 23:02:55 crc kubenswrapper[4830]: I1203 23:02:55.333866 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:55 crc 
kubenswrapper[4830]: I1203 23:02:55.415495 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:02:56 crc kubenswrapper[4830]: I1203 23:02:56.550218 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smhgn"] Dec 03 23:02:56 crc kubenswrapper[4830]: I1203 23:02:56.746155 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-smhgn" podUID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" containerName="registry-server" containerID="cri-o://74b285fffd72498b24e533a7a39ecebbf7a8356ad0511775fe4081dadf4d6a99" gracePeriod=2 Dec 03 23:02:57 crc kubenswrapper[4830]: I1203 23:02:57.757006 4830 generic.go:334] "Generic (PLEG): container finished" podID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" containerID="74b285fffd72498b24e533a7a39ecebbf7a8356ad0511775fe4081dadf4d6a99" exitCode=0 Dec 03 23:02:57 crc kubenswrapper[4830]: I1203 23:02:57.757054 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smhgn" event={"ID":"c3526c5d-f3be-4dfb-b7e6-58c25f52b854","Type":"ContainerDied","Data":"74b285fffd72498b24e533a7a39ecebbf7a8356ad0511775fe4081dadf4d6a99"} Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.415257 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.562458 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-catalog-content\") pod \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.562578 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-utilities\") pod \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.562641 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz446\" (UniqueName: \"kubernetes.io/projected/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-kube-api-access-gz446\") pod \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\" (UID: \"c3526c5d-f3be-4dfb-b7e6-58c25f52b854\") " Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.564047 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-utilities" (OuterVolumeSpecName: "utilities") pod "c3526c5d-f3be-4dfb-b7e6-58c25f52b854" (UID: "c3526c5d-f3be-4dfb-b7e6-58c25f52b854"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.568978 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-kube-api-access-gz446" (OuterVolumeSpecName: "kube-api-access-gz446") pod "c3526c5d-f3be-4dfb-b7e6-58c25f52b854" (UID: "c3526c5d-f3be-4dfb-b7e6-58c25f52b854"). InnerVolumeSpecName "kube-api-access-gz446". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.606046 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3526c5d-f3be-4dfb-b7e6-58c25f52b854" (UID: "c3526c5d-f3be-4dfb-b7e6-58c25f52b854"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.665042 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.665094 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.665109 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz446\" (UniqueName: \"kubernetes.io/projected/c3526c5d-f3be-4dfb-b7e6-58c25f52b854-kube-api-access-gz446\") on node \"crc\" DevicePath \"\"" Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.835282 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smhgn" event={"ID":"c3526c5d-f3be-4dfb-b7e6-58c25f52b854","Type":"ContainerDied","Data":"2229944e1a5c7514f9822b6a5362cad9b693f5349a37de980ac03a11d2db14e5"} Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.835343 4830 scope.go:117] "RemoveContainer" containerID="74b285fffd72498b24e533a7a39ecebbf7a8356ad0511775fe4081dadf4d6a99" Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.835501 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smhgn" Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.847721 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/crc-debug-64z9r" event={"ID":"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d","Type":"ContainerStarted","Data":"f577558fb517bfc0105cf067798df950a0d8f30597d1b8a84c94ae7039d83960"} Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.880827 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b9twb/crc-debug-64z9r" podStartSLOduration=1.582753679 podStartE2EDuration="13.880801576s" podCreationTimestamp="2025-12-03 23:02:50 +0000 UTC" firstStartedPulling="2025-12-03 23:02:50.649302055 +0000 UTC m=+3459.645763404" lastFinishedPulling="2025-12-03 23:03:02.947349952 +0000 UTC m=+3471.943811301" observedRunningTime="2025-12-03 23:03:03.863726913 +0000 UTC m=+3472.860188262" watchObservedRunningTime="2025-12-03 23:03:03.880801576 +0000 UTC m=+3472.877262925" Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.954072 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smhgn"] Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.956697 4830 scope.go:117] "RemoveContainer" containerID="ac1c37873304c27eb952b5b64a9cd78c041a8bda4a5d0d01fca908c831474abd" Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.981068 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-smhgn"] Dec 03 23:03:03 crc kubenswrapper[4830]: I1203 23:03:03.997064 4830 scope.go:117] "RemoveContainer" containerID="123a498e9f3bf5dbcbae8eeeada56c9693a3da4d513361c20add23ea86f5c670" Dec 03 23:03:05 crc kubenswrapper[4830]: I1203 23:03:05.348562 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" path="/var/lib/kubelet/pods/c3526c5d-f3be-4dfb-b7e6-58c25f52b854/volumes" Dec 03 
23:03:29 crc kubenswrapper[4830]: I1203 23:03:29.553743 4830 scope.go:117] "RemoveContainer" containerID="49b4cfb87f02d5dd714ca65638585b9904ace13eef3d4e343cb06a465a3374af" Dec 03 23:03:46 crc kubenswrapper[4830]: I1203 23:03:46.386116 4830 generic.go:334] "Generic (PLEG): container finished" podID="6f265a45-b2cd-4d86-b693-91e5d9fb6e7d" containerID="f577558fb517bfc0105cf067798df950a0d8f30597d1b8a84c94ae7039d83960" exitCode=0 Dec 03 23:03:46 crc kubenswrapper[4830]: I1203 23:03:46.386200 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/crc-debug-64z9r" event={"ID":"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d","Type":"ContainerDied","Data":"f577558fb517bfc0105cf067798df950a0d8f30597d1b8a84c94ae7039d83960"} Dec 03 23:03:47 crc kubenswrapper[4830]: I1203 23:03:47.511120 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-64z9r" Dec 03 23:03:47 crc kubenswrapper[4830]: I1203 23:03:47.553609 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b9twb/crc-debug-64z9r"] Dec 03 23:03:47 crc kubenswrapper[4830]: I1203 23:03:47.567437 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b9twb/crc-debug-64z9r"] Dec 03 23:03:47 crc kubenswrapper[4830]: I1203 23:03:47.696838 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sctlf\" (UniqueName: \"kubernetes.io/projected/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-kube-api-access-sctlf\") pod \"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d\" (UID: \"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d\") " Dec 03 23:03:47 crc kubenswrapper[4830]: I1203 23:03:47.696938 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-host\") pod \"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d\" (UID: \"6f265a45-b2cd-4d86-b693-91e5d9fb6e7d\") " Dec 03 23:03:47 crc 
kubenswrapper[4830]: I1203 23:03:47.697447 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-host" (OuterVolumeSpecName: "host") pod "6f265a45-b2cd-4d86-b693-91e5d9fb6e7d" (UID: "6f265a45-b2cd-4d86-b693-91e5d9fb6e7d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:03:47 crc kubenswrapper[4830]: I1203 23:03:47.703153 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-kube-api-access-sctlf" (OuterVolumeSpecName: "kube-api-access-sctlf") pod "6f265a45-b2cd-4d86-b693-91e5d9fb6e7d" (UID: "6f265a45-b2cd-4d86-b693-91e5d9fb6e7d"). InnerVolumeSpecName "kube-api-access-sctlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:03:47 crc kubenswrapper[4830]: I1203 23:03:47.800574 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sctlf\" (UniqueName: \"kubernetes.io/projected/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-kube-api-access-sctlf\") on node \"crc\" DevicePath \"\"" Dec 03 23:03:47 crc kubenswrapper[4830]: I1203 23:03:47.800620 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d-host\") on node \"crc\" DevicePath \"\"" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.408173 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb51901b2f5f935541e1ab2297e35d673921592d974ca55bcfa0a8f9ad3aeea0" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.408226 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-64z9r" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.748837 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b9twb/crc-debug-9dqhs"] Dec 03 23:03:48 crc kubenswrapper[4830]: E1203 23:03:48.749260 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" containerName="extract-utilities" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.749277 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" containerName="extract-utilities" Dec 03 23:03:48 crc kubenswrapper[4830]: E1203 23:03:48.749289 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f265a45-b2cd-4d86-b693-91e5d9fb6e7d" containerName="container-00" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.749295 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f265a45-b2cd-4d86-b693-91e5d9fb6e7d" containerName="container-00" Dec 03 23:03:48 crc kubenswrapper[4830]: E1203 23:03:48.749305 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" containerName="extract-content" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.749311 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" containerName="extract-content" Dec 03 23:03:48 crc kubenswrapper[4830]: E1203 23:03:48.749320 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" containerName="registry-server" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.749325 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" containerName="registry-server" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.749583 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3526c5d-f3be-4dfb-b7e6-58c25f52b854" 
containerName="registry-server" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.749605 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f265a45-b2cd-4d86-b693-91e5d9fb6e7d" containerName="container-00" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.750308 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-9dqhs" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.920841 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f99b3c8f-34be-4333-b955-ea03129855a3-host\") pod \"crc-debug-9dqhs\" (UID: \"f99b3c8f-34be-4333-b955-ea03129855a3\") " pod="openshift-must-gather-b9twb/crc-debug-9dqhs" Dec 03 23:03:48 crc kubenswrapper[4830]: I1203 23:03:48.921625 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnr5z\" (UniqueName: \"kubernetes.io/projected/f99b3c8f-34be-4333-b955-ea03129855a3-kube-api-access-bnr5z\") pod \"crc-debug-9dqhs\" (UID: \"f99b3c8f-34be-4333-b955-ea03129855a3\") " pod="openshift-must-gather-b9twb/crc-debug-9dqhs" Dec 03 23:03:49 crc kubenswrapper[4830]: I1203 23:03:49.024554 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f99b3c8f-34be-4333-b955-ea03129855a3-host\") pod \"crc-debug-9dqhs\" (UID: \"f99b3c8f-34be-4333-b955-ea03129855a3\") " pod="openshift-must-gather-b9twb/crc-debug-9dqhs" Dec 03 23:03:49 crc kubenswrapper[4830]: I1203 23:03:49.024703 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f99b3c8f-34be-4333-b955-ea03129855a3-host\") pod \"crc-debug-9dqhs\" (UID: \"f99b3c8f-34be-4333-b955-ea03129855a3\") " pod="openshift-must-gather-b9twb/crc-debug-9dqhs" Dec 03 23:03:49 crc kubenswrapper[4830]: I1203 23:03:49.025504 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnr5z\" (UniqueName: \"kubernetes.io/projected/f99b3c8f-34be-4333-b955-ea03129855a3-kube-api-access-bnr5z\") pod \"crc-debug-9dqhs\" (UID: \"f99b3c8f-34be-4333-b955-ea03129855a3\") " pod="openshift-must-gather-b9twb/crc-debug-9dqhs" Dec 03 23:03:49 crc kubenswrapper[4830]: I1203 23:03:49.057690 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnr5z\" (UniqueName: \"kubernetes.io/projected/f99b3c8f-34be-4333-b955-ea03129855a3-kube-api-access-bnr5z\") pod \"crc-debug-9dqhs\" (UID: \"f99b3c8f-34be-4333-b955-ea03129855a3\") " pod="openshift-must-gather-b9twb/crc-debug-9dqhs" Dec 03 23:03:49 crc kubenswrapper[4830]: I1203 23:03:49.068790 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-9dqhs" Dec 03 23:03:49 crc kubenswrapper[4830]: I1203 23:03:49.347895 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f265a45-b2cd-4d86-b693-91e5d9fb6e7d" path="/var/lib/kubelet/pods/6f265a45-b2cd-4d86-b693-91e5d9fb6e7d/volumes" Dec 03 23:03:49 crc kubenswrapper[4830]: I1203 23:03:49.418904 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/crc-debug-9dqhs" event={"ID":"f99b3c8f-34be-4333-b955-ea03129855a3","Type":"ContainerStarted","Data":"5899b57330aad54a122cfb7a6be2e4bca8c5ded4c49b7022e0e593dfbd467db7"} Dec 03 23:03:49 crc kubenswrapper[4830]: I1203 23:03:49.418954 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/crc-debug-9dqhs" event={"ID":"f99b3c8f-34be-4333-b955-ea03129855a3","Type":"ContainerStarted","Data":"e6be3307d609e349a0bc1183f61c83e42e6f81f9b8eaa0f037678c5bf1a27521"} Dec 03 23:03:49 crc kubenswrapper[4830]: I1203 23:03:49.435423 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b9twb/crc-debug-9dqhs" 
podStartSLOduration=1.435408318 podStartE2EDuration="1.435408318s" podCreationTimestamp="2025-12-03 23:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:03:49.430640979 +0000 UTC m=+3518.427102328" watchObservedRunningTime="2025-12-03 23:03:49.435408318 +0000 UTC m=+3518.431869667" Dec 03 23:03:50 crc kubenswrapper[4830]: I1203 23:03:50.430725 4830 generic.go:334] "Generic (PLEG): container finished" podID="f99b3c8f-34be-4333-b955-ea03129855a3" containerID="5899b57330aad54a122cfb7a6be2e4bca8c5ded4c49b7022e0e593dfbd467db7" exitCode=0 Dec 03 23:03:50 crc kubenswrapper[4830]: I1203 23:03:50.430786 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/crc-debug-9dqhs" event={"ID":"f99b3c8f-34be-4333-b955-ea03129855a3","Type":"ContainerDied","Data":"5899b57330aad54a122cfb7a6be2e4bca8c5ded4c49b7022e0e593dfbd467db7"} Dec 03 23:03:51 crc kubenswrapper[4830]: I1203 23:03:51.550218 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-9dqhs" Dec 03 23:03:51 crc kubenswrapper[4830]: I1203 23:03:51.586597 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b9twb/crc-debug-9dqhs"] Dec 03 23:03:51 crc kubenswrapper[4830]: I1203 23:03:51.595907 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b9twb/crc-debug-9dqhs"] Dec 03 23:03:51 crc kubenswrapper[4830]: I1203 23:03:51.679323 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f99b3c8f-34be-4333-b955-ea03129855a3-host\") pod \"f99b3c8f-34be-4333-b955-ea03129855a3\" (UID: \"f99b3c8f-34be-4333-b955-ea03129855a3\") " Dec 03 23:03:51 crc kubenswrapper[4830]: I1203 23:03:51.679592 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnr5z\" (UniqueName: \"kubernetes.io/projected/f99b3c8f-34be-4333-b955-ea03129855a3-kube-api-access-bnr5z\") pod \"f99b3c8f-34be-4333-b955-ea03129855a3\" (UID: \"f99b3c8f-34be-4333-b955-ea03129855a3\") " Dec 03 23:03:51 crc kubenswrapper[4830]: I1203 23:03:51.679715 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f99b3c8f-34be-4333-b955-ea03129855a3-host" (OuterVolumeSpecName: "host") pod "f99b3c8f-34be-4333-b955-ea03129855a3" (UID: "f99b3c8f-34be-4333-b955-ea03129855a3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:03:51 crc kubenswrapper[4830]: I1203 23:03:51.680137 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f99b3c8f-34be-4333-b955-ea03129855a3-host\") on node \"crc\" DevicePath \"\"" Dec 03 23:03:51 crc kubenswrapper[4830]: I1203 23:03:51.685255 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99b3c8f-34be-4333-b955-ea03129855a3-kube-api-access-bnr5z" (OuterVolumeSpecName: "kube-api-access-bnr5z") pod "f99b3c8f-34be-4333-b955-ea03129855a3" (UID: "f99b3c8f-34be-4333-b955-ea03129855a3"). InnerVolumeSpecName "kube-api-access-bnr5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:03:51 crc kubenswrapper[4830]: I1203 23:03:51.782326 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnr5z\" (UniqueName: \"kubernetes.io/projected/f99b3c8f-34be-4333-b955-ea03129855a3-kube-api-access-bnr5z\") on node \"crc\" DevicePath \"\"" Dec 03 23:03:52 crc kubenswrapper[4830]: I1203 23:03:52.451315 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6be3307d609e349a0bc1183f61c83e42e6f81f9b8eaa0f037678c5bf1a27521" Dec 03 23:03:52 crc kubenswrapper[4830]: I1203 23:03:52.451393 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-9dqhs" Dec 03 23:03:52 crc kubenswrapper[4830]: I1203 23:03:52.813191 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b9twb/crc-debug-xjlrr"] Dec 03 23:03:52 crc kubenswrapper[4830]: E1203 23:03:52.813942 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99b3c8f-34be-4333-b955-ea03129855a3" containerName="container-00" Dec 03 23:03:52 crc kubenswrapper[4830]: I1203 23:03:52.813974 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99b3c8f-34be-4333-b955-ea03129855a3" containerName="container-00" Dec 03 23:03:52 crc kubenswrapper[4830]: I1203 23:03:52.814169 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f99b3c8f-34be-4333-b955-ea03129855a3" containerName="container-00" Dec 03 23:03:52 crc kubenswrapper[4830]: I1203 23:03:52.815000 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-xjlrr" Dec 03 23:03:52 crc kubenswrapper[4830]: I1203 23:03:52.904914 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9r8c\" (UniqueName: \"kubernetes.io/projected/57a9958a-caf7-4d14-b3f2-750b810845cf-kube-api-access-g9r8c\") pod \"crc-debug-xjlrr\" (UID: \"57a9958a-caf7-4d14-b3f2-750b810845cf\") " pod="openshift-must-gather-b9twb/crc-debug-xjlrr" Dec 03 23:03:52 crc kubenswrapper[4830]: I1203 23:03:52.904963 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57a9958a-caf7-4d14-b3f2-750b810845cf-host\") pod \"crc-debug-xjlrr\" (UID: \"57a9958a-caf7-4d14-b3f2-750b810845cf\") " pod="openshift-must-gather-b9twb/crc-debug-xjlrr" Dec 03 23:03:53 crc kubenswrapper[4830]: I1203 23:03:53.006425 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9r8c\" (UniqueName: 
\"kubernetes.io/projected/57a9958a-caf7-4d14-b3f2-750b810845cf-kube-api-access-g9r8c\") pod \"crc-debug-xjlrr\" (UID: \"57a9958a-caf7-4d14-b3f2-750b810845cf\") " pod="openshift-must-gather-b9twb/crc-debug-xjlrr" Dec 03 23:03:53 crc kubenswrapper[4830]: I1203 23:03:53.006479 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57a9958a-caf7-4d14-b3f2-750b810845cf-host\") pod \"crc-debug-xjlrr\" (UID: \"57a9958a-caf7-4d14-b3f2-750b810845cf\") " pod="openshift-must-gather-b9twb/crc-debug-xjlrr" Dec 03 23:03:53 crc kubenswrapper[4830]: I1203 23:03:53.006669 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57a9958a-caf7-4d14-b3f2-750b810845cf-host\") pod \"crc-debug-xjlrr\" (UID: \"57a9958a-caf7-4d14-b3f2-750b810845cf\") " pod="openshift-must-gather-b9twb/crc-debug-xjlrr" Dec 03 23:03:53 crc kubenswrapper[4830]: I1203 23:03:53.027001 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9r8c\" (UniqueName: \"kubernetes.io/projected/57a9958a-caf7-4d14-b3f2-750b810845cf-kube-api-access-g9r8c\") pod \"crc-debug-xjlrr\" (UID: \"57a9958a-caf7-4d14-b3f2-750b810845cf\") " pod="openshift-must-gather-b9twb/crc-debug-xjlrr" Dec 03 23:03:53 crc kubenswrapper[4830]: I1203 23:03:53.135443 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-xjlrr" Dec 03 23:03:53 crc kubenswrapper[4830]: W1203 23:03:53.161644 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57a9958a_caf7_4d14_b3f2_750b810845cf.slice/crio-651d4091d26e606ee99006627c4889b6765a3e147913d105534813cccde68819 WatchSource:0}: Error finding container 651d4091d26e606ee99006627c4889b6765a3e147913d105534813cccde68819: Status 404 returned error can't find the container with id 651d4091d26e606ee99006627c4889b6765a3e147913d105534813cccde68819 Dec 03 23:03:53 crc kubenswrapper[4830]: I1203 23:03:53.352166 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99b3c8f-34be-4333-b955-ea03129855a3" path="/var/lib/kubelet/pods/f99b3c8f-34be-4333-b955-ea03129855a3/volumes" Dec 03 23:03:53 crc kubenswrapper[4830]: I1203 23:03:53.461928 4830 generic.go:334] "Generic (PLEG): container finished" podID="57a9958a-caf7-4d14-b3f2-750b810845cf" containerID="8931ed241bea86e3682c1d07bd09f40442ebaea4a61e3a9d02c882ae8693b846" exitCode=0 Dec 03 23:03:53 crc kubenswrapper[4830]: I1203 23:03:53.461978 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/crc-debug-xjlrr" event={"ID":"57a9958a-caf7-4d14-b3f2-750b810845cf","Type":"ContainerDied","Data":"8931ed241bea86e3682c1d07bd09f40442ebaea4a61e3a9d02c882ae8693b846"} Dec 03 23:03:53 crc kubenswrapper[4830]: I1203 23:03:53.462013 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/crc-debug-xjlrr" event={"ID":"57a9958a-caf7-4d14-b3f2-750b810845cf","Type":"ContainerStarted","Data":"651d4091d26e606ee99006627c4889b6765a3e147913d105534813cccde68819"} Dec 03 23:03:53 crc kubenswrapper[4830]: I1203 23:03:53.512055 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b9twb/crc-debug-xjlrr"] Dec 03 23:03:53 crc kubenswrapper[4830]: I1203 23:03:53.543392 4830 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b9twb/crc-debug-xjlrr"] Dec 03 23:03:53 crc kubenswrapper[4830]: E1203 23:03:53.648569 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57a9958a_caf7_4d14_b3f2_750b810845cf.slice/crio-8931ed241bea86e3682c1d07bd09f40442ebaea4a61e3a9d02c882ae8693b846.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57a9958a_caf7_4d14_b3f2_750b810845cf.slice/crio-conmon-8931ed241bea86e3682c1d07bd09f40442ebaea4a61e3a9d02c882ae8693b846.scope\": RecentStats: unable to find data in memory cache]" Dec 03 23:03:54 crc kubenswrapper[4830]: I1203 23:03:54.591200 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-xjlrr" Dec 03 23:03:54 crc kubenswrapper[4830]: I1203 23:03:54.649803 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9r8c\" (UniqueName: \"kubernetes.io/projected/57a9958a-caf7-4d14-b3f2-750b810845cf-kube-api-access-g9r8c\") pod \"57a9958a-caf7-4d14-b3f2-750b810845cf\" (UID: \"57a9958a-caf7-4d14-b3f2-750b810845cf\") " Dec 03 23:03:54 crc kubenswrapper[4830]: I1203 23:03:54.649920 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57a9958a-caf7-4d14-b3f2-750b810845cf-host\") pod \"57a9958a-caf7-4d14-b3f2-750b810845cf\" (UID: \"57a9958a-caf7-4d14-b3f2-750b810845cf\") " Dec 03 23:03:54 crc kubenswrapper[4830]: I1203 23:03:54.650275 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57a9958a-caf7-4d14-b3f2-750b810845cf-host" (OuterVolumeSpecName: "host") pod "57a9958a-caf7-4d14-b3f2-750b810845cf" (UID: "57a9958a-caf7-4d14-b3f2-750b810845cf"). 
InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:03:54 crc kubenswrapper[4830]: I1203 23:03:54.651485 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57a9958a-caf7-4d14-b3f2-750b810845cf-host\") on node \"crc\" DevicePath \"\"" Dec 03 23:03:54 crc kubenswrapper[4830]: I1203 23:03:54.656772 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a9958a-caf7-4d14-b3f2-750b810845cf-kube-api-access-g9r8c" (OuterVolumeSpecName: "kube-api-access-g9r8c") pod "57a9958a-caf7-4d14-b3f2-750b810845cf" (UID: "57a9958a-caf7-4d14-b3f2-750b810845cf"). InnerVolumeSpecName "kube-api-access-g9r8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:03:54 crc kubenswrapper[4830]: I1203 23:03:54.753251 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9r8c\" (UniqueName: \"kubernetes.io/projected/57a9958a-caf7-4d14-b3f2-750b810845cf-kube-api-access-g9r8c\") on node \"crc\" DevicePath \"\"" Dec 03 23:03:55 crc kubenswrapper[4830]: I1203 23:03:55.348204 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a9958a-caf7-4d14-b3f2-750b810845cf" path="/var/lib/kubelet/pods/57a9958a-caf7-4d14-b3f2-750b810845cf/volumes" Dec 03 23:03:55 crc kubenswrapper[4830]: I1203 23:03:55.485414 4830 scope.go:117] "RemoveContainer" containerID="8931ed241bea86e3682c1d07bd09f40442ebaea4a61e3a9d02c882ae8693b846" Dec 03 23:03:55 crc kubenswrapper[4830]: I1203 23:03:55.485662 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9twb/crc-debug-xjlrr" Dec 03 23:04:18 crc kubenswrapper[4830]: I1203 23:04:18.220949 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cd27b153-5334-4329-91de-0e6941ae9e97/init-config-reloader/0.log" Dec 03 23:04:18 crc kubenswrapper[4830]: I1203 23:04:18.387639 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cd27b153-5334-4329-91de-0e6941ae9e97/init-config-reloader/0.log" Dec 03 23:04:18 crc kubenswrapper[4830]: I1203 23:04:18.393144 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cd27b153-5334-4329-91de-0e6941ae9e97/alertmanager/0.log" Dec 03 23:04:18 crc kubenswrapper[4830]: I1203 23:04:18.413683 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cd27b153-5334-4329-91de-0e6941ae9e97/config-reloader/0.log" Dec 03 23:04:18 crc kubenswrapper[4830]: I1203 23:04:18.574902 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6444ff6d8d-kq894_9e17dece-7a57-4c56-b128-0316add6808f/barbican-api/0.log" Dec 03 23:04:18 crc kubenswrapper[4830]: I1203 23:04:18.576119 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6444ff6d8d-kq894_9e17dece-7a57-4c56-b128-0316add6808f/barbican-api-log/0.log" Dec 03 23:04:18 crc kubenswrapper[4830]: I1203 23:04:18.644600 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8f8d56fd8-kd4r4_56894369-6e4d-451e-b510-60c1cad4b111/barbican-keystone-listener/0.log" Dec 03 23:04:18 crc kubenswrapper[4830]: I1203 23:04:18.863018 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8f8d56fd8-kd4r4_56894369-6e4d-451e-b510-60c1cad4b111/barbican-keystone-listener-log/0.log" Dec 03 23:04:18 crc kubenswrapper[4830]: I1203 
23:04:18.883838 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c48ccfdc-rn6l7_f622ae76-ce43-4025-90eb-e609fbe2a004/barbican-worker-log/0.log" Dec 03 23:04:18 crc kubenswrapper[4830]: I1203 23:04:18.905590 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c48ccfdc-rn6l7_f622ae76-ce43-4025-90eb-e609fbe2a004/barbican-worker/0.log" Dec 03 23:04:19 crc kubenswrapper[4830]: I1203 23:04:19.061847 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7_be2984ba-f7dc-4271-ad04-f59c4ad3729d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:19 crc kubenswrapper[4830]: I1203 23:04:19.242797 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf/ceilometer-central-agent/0.log" Dec 03 23:04:19 crc kubenswrapper[4830]: I1203 23:04:19.313922 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf/ceilometer-notification-agent/0.log" Dec 03 23:04:19 crc kubenswrapper[4830]: I1203 23:04:19.390729 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf/proxy-httpd/0.log" Dec 03 23:04:19 crc kubenswrapper[4830]: I1203 23:04:19.419372 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf/sg-core/0.log" Dec 03 23:04:19 crc kubenswrapper[4830]: I1203 23:04:19.563166 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_974ffdb3-a522-4b55-bdf0-b935f1378f20/cinder-api/0.log" Dec 03 23:04:19 crc kubenswrapper[4830]: I1203 23:04:19.609886 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_974ffdb3-a522-4b55-bdf0-b935f1378f20/cinder-api-log/0.log" Dec 03 23:04:19 crc kubenswrapper[4830]: 
I1203 23:04:19.778622 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_499f12cf-14df-48d9-b2ee-11691c85e1ed/cinder-scheduler/0.log" Dec 03 23:04:19 crc kubenswrapper[4830]: I1203 23:04:19.825264 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_499f12cf-14df-48d9-b2ee-11691c85e1ed/probe/0.log" Dec 03 23:04:20 crc kubenswrapper[4830]: I1203 23:04:20.007708 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_74cd90ac-c295-404a-afa5-d2977c397561/cloudkitty-api-log/0.log" Dec 03 23:04:20 crc kubenswrapper[4830]: I1203 23:04:20.056184 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_74cd90ac-c295-404a-afa5-d2977c397561/cloudkitty-api/0.log" Dec 03 23:04:20 crc kubenswrapper[4830]: I1203 23:04:20.095198 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_83a885d2-eea8-4a2c-83d7-a0a945597421/loki-compactor/0.log" Dec 03 23:04:20 crc kubenswrapper[4830]: I1203 23:04:20.245866 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-56cd74f89f-gvmnk_596170cd-57e9-4665-947d-ddb1549a38e0/loki-distributor/0.log" Dec 03 23:04:20 crc kubenswrapper[4830]: I1203 23:04:20.303106 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-gbw74_800e6ad6-526b-4134-b759-b9c0d884e3f5/gateway/0.log" Dec 03 23:04:20 crc kubenswrapper[4830]: I1203 23:04:20.537266 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-tbz5t_bb34bcb7-4a40-4d5b-a5ca-55571c61b999/gateway/0.log" Dec 03 23:04:20 crc kubenswrapper[4830]: I1203 23:04:20.601840 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_5af9b96f-fb0a-482b-9000-3b76a8c6c07c/loki-index-gateway/0.log" Dec 
03 23:04:21 crc kubenswrapper[4830]: I1203 23:04:21.113379 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-779849886d-qg9t9_de378972-d74f-44fe-a727-19bde47f0cbe/loki-query-frontend/0.log" Dec 03 23:04:21 crc kubenswrapper[4830]: I1203 23:04:21.192792 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_09564097-60ae-4b1d-bd03-ba8b5a254167/loki-ingester/0.log" Dec 03 23:04:21 crc kubenswrapper[4830]: I1203 23:04:21.657767 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-548665d79b-xntf7_2d6f2070-f2d3-47d2-b43f-dfdaed23e03b/loki-querier/0.log" Dec 03 23:04:21 crc kubenswrapper[4830]: I1203 23:04:21.861241 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-5zttp_30423410-ddd7-4a94-8a3a-b20b6dfd64c8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:22 crc kubenswrapper[4830]: I1203 23:04:22.168051 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4_6145c52f-1582-4092-a44a-cc665216b2af/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:22 crc kubenswrapper[4830]: I1203 23:04:22.307884 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pzsz8_f5a1439e-ab65-4263-bbfc-09933a4db924/init/0.log" Dec 03 23:04:22 crc kubenswrapper[4830]: I1203 23:04:22.358855 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pzsz8_f5a1439e-ab65-4263-bbfc-09933a4db924/init/0.log" Dec 03 23:04:22 crc kubenswrapper[4830]: I1203 23:04:22.432806 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pzsz8_f5a1439e-ab65-4263-bbfc-09933a4db924/dnsmasq-dns/0.log" Dec 03 23:04:22 crc kubenswrapper[4830]: I1203 
23:04:22.587336 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj_1f140812-ce94-43b3-bdec-466d8a3d2417/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:22 crc kubenswrapper[4830]: I1203 23:04:22.888142 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_063860c7-63d8-4cec-b4f1-b6b501779d90/glance-log/0.log" Dec 03 23:04:22 crc kubenswrapper[4830]: I1203 23:04:22.910576 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_063860c7-63d8-4cec-b4f1-b6b501779d90/glance-httpd/0.log" Dec 03 23:04:22 crc kubenswrapper[4830]: I1203 23:04:22.926394 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_6be5d9e4-3136-4c97-89ef-9376c1ef588c/cloudkitty-proc/0.log" Dec 03 23:04:23 crc kubenswrapper[4830]: I1203 23:04:23.026206 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62/glance-httpd/0.log" Dec 03 23:04:23 crc kubenswrapper[4830]: I1203 23:04:23.108610 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62/glance-log/0.log" Dec 03 23:04:23 crc kubenswrapper[4830]: I1203 23:04:23.157523 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs_1c532c81-40fc-4058-bb22-abec161c538a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:23 crc kubenswrapper[4830]: I1203 23:04:23.240933 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kk4xf_4fc52ab9-68d4-4a61-a92f-8de75a563adc/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:23 crc kubenswrapper[4830]: I1203 23:04:23.490518 4830 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29413381-s6q5j_29ffb417-4cef-4a91-a9f7-74fcc99e14df/keystone-cron/0.log" Dec 03 23:04:23 crc kubenswrapper[4830]: I1203 23:04:23.619059 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-79b896c7bd-gsfgm_ca47aca8-81d5-4c28-b82b-147b7835a87d/keystone-api/0.log" Dec 03 23:04:23 crc kubenswrapper[4830]: I1203 23:04:23.650970 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4e04514b-2828-4cd0-9fa7-7d5a970957a0/kube-state-metrics/0.log" Dec 03 23:04:23 crc kubenswrapper[4830]: I1203 23:04:23.717268 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7_dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:24 crc kubenswrapper[4830]: I1203 23:04:24.132788 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57d569655f-t92g7_6d9377f1-1fe5-4451-8224-b5e9e253efa5/neutron-httpd/0.log" Dec 03 23:04:24 crc kubenswrapper[4830]: I1203 23:04:24.149046 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57d569655f-t92g7_6d9377f1-1fe5-4451-8224-b5e9e253efa5/neutron-api/0.log" Dec 03 23:04:24 crc kubenswrapper[4830]: I1203 23:04:24.213213 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz_f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:24 crc kubenswrapper[4830]: I1203 23:04:24.657352 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_22dd5f84-7ae4-442e-b6a1-dd27b2d3875d/nova-api-log/0.log" Dec 03 23:04:24 crc kubenswrapper[4830]: I1203 23:04:24.755274 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_4085042f-6768-44ca-be35-b0a9c78655fa/nova-cell0-conductor-conductor/0.log" Dec 03 23:04:24 crc kubenswrapper[4830]: I1203 23:04:24.983757 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_22dd5f84-7ae4-442e-b6a1-dd27b2d3875d/nova-api-api/0.log" Dec 03 23:04:25 crc kubenswrapper[4830]: I1203 23:04:25.007693 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1eea67ad-cea1-4acf-a451-76a490e27693/nova-cell1-conductor-conductor/0.log" Dec 03 23:04:25 crc kubenswrapper[4830]: I1203 23:04:25.032461 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_33e1d4ce-81f4-4a02-8ac5-384686943b19/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 23:04:25 crc kubenswrapper[4830]: I1203 23:04:25.338576 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-cjj5b_fde89bd5-aa9c-44c2-b854-696c3e0f50e7/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:25 crc kubenswrapper[4830]: I1203 23:04:25.398407 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b44e63ea-f87b-48f3-8af7-bc3e35ce5265/nova-metadata-log/0.log" Dec 03 23:04:25 crc kubenswrapper[4830]: I1203 23:04:25.709967 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_4dd8f4ca-85be-49df-b95a-adb609cbbff2/nova-scheduler-scheduler/0.log" Dec 03 23:04:25 crc kubenswrapper[4830]: I1203 23:04:25.800749 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_21e1ac03-6466-4663-bff2-68ff2cc7801d/mysql-bootstrap/0.log" Dec 03 23:04:25 crc kubenswrapper[4830]: I1203 23:04:25.980022 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_21e1ac03-6466-4663-bff2-68ff2cc7801d/mysql-bootstrap/0.log" Dec 03 23:04:26 crc kubenswrapper[4830]: 
I1203 23:04:26.037459 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_21e1ac03-6466-4663-bff2-68ff2cc7801d/galera/0.log" Dec 03 23:04:26 crc kubenswrapper[4830]: I1203 23:04:26.200542 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_85bd20b1-76d6-4238-be14-1c5891d6bbd8/mysql-bootstrap/0.log" Dec 03 23:04:26 crc kubenswrapper[4830]: I1203 23:04:26.497934 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_85bd20b1-76d6-4238-be14-1c5891d6bbd8/mysql-bootstrap/0.log" Dec 03 23:04:26 crc kubenswrapper[4830]: I1203 23:04:26.504416 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b44e63ea-f87b-48f3-8af7-bc3e35ce5265/nova-metadata-metadata/0.log" Dec 03 23:04:26 crc kubenswrapper[4830]: I1203 23:04:26.509055 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_85bd20b1-76d6-4238-be14-1c5891d6bbd8/galera/0.log" Dec 03 23:04:26 crc kubenswrapper[4830]: I1203 23:04:26.675110 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_670e6335-d34c-46e4-8b4d-89dbd65c35a7/openstackclient/0.log" Dec 03 23:04:26 crc kubenswrapper[4830]: I1203 23:04:26.680786 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:04:26 crc kubenswrapper[4830]: I1203 23:04:26.680827 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 
23:04:26 crc kubenswrapper[4830]: I1203 23:04:26.779381 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5r2qg_3af15ba1-ae94-416f-afe7-534d88ee8a64/openstack-network-exporter/0.log" Dec 03 23:04:26 crc kubenswrapper[4830]: I1203 23:04:26.940547 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nlxm7_6bdf507c-05be-4df8-8c33-85f15c05237c/ovn-controller/0.log" Dec 03 23:04:27 crc kubenswrapper[4830]: I1203 23:04:27.058789 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8j27r_b0d02b19-e65a-4c45-b658-a34c69cdf74e/ovsdb-server-init/0.log" Dec 03 23:04:27 crc kubenswrapper[4830]: I1203 23:04:27.251107 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8j27r_b0d02b19-e65a-4c45-b658-a34c69cdf74e/ovsdb-server-init/0.log" Dec 03 23:04:27 crc kubenswrapper[4830]: I1203 23:04:27.281536 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8j27r_b0d02b19-e65a-4c45-b658-a34c69cdf74e/ovs-vswitchd/0.log" Dec 03 23:04:27 crc kubenswrapper[4830]: I1203 23:04:27.313454 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8j27r_b0d02b19-e65a-4c45-b658-a34c69cdf74e/ovsdb-server/0.log" Dec 03 23:04:27 crc kubenswrapper[4830]: I1203 23:04:27.597849 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-2wbdq_f036d299-7239-400c-b3c4-f20ec8ed1f26/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:27 crc kubenswrapper[4830]: I1203 23:04:27.615173 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_516cc148-c477-46f0-bc3e-475ad6003486/openstack-network-exporter/0.log" Dec 03 23:04:27 crc kubenswrapper[4830]: I1203 23:04:27.731763 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_516cc148-c477-46f0-bc3e-475ad6003486/ovn-northd/0.log" Dec 03 23:04:27 crc kubenswrapper[4830]: I1203 23:04:27.793194 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_63965380-d86f-4abf-9c9c-4d5a25ad6754/ovsdbserver-nb/0.log" Dec 03 23:04:27 crc kubenswrapper[4830]: I1203 23:04:27.860653 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_63965380-d86f-4abf-9c9c-4d5a25ad6754/openstack-network-exporter/0.log" Dec 03 23:04:28 crc kubenswrapper[4830]: I1203 23:04:28.107947 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c212e9c4-4562-48b2-9be8-bf00f52a076a/ovsdbserver-sb/0.log" Dec 03 23:04:28 crc kubenswrapper[4830]: I1203 23:04:28.114277 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c212e9c4-4562-48b2-9be8-bf00f52a076a/openstack-network-exporter/0.log" Dec 03 23:04:28 crc kubenswrapper[4830]: I1203 23:04:28.333258 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c66b45784-w46xw_7956cb25-0b77-4822-b26e-dd512559f30b/placement-api/0.log" Dec 03 23:04:28 crc kubenswrapper[4830]: I1203 23:04:28.364845 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c66b45784-w46xw_7956cb25-0b77-4822-b26e-dd512559f30b/placement-log/0.log" Dec 03 23:04:28 crc kubenswrapper[4830]: I1203 23:04:28.394032 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_86b4b71c-3ed3-4413-a90b-4523b4a5c549/init-config-reloader/0.log" Dec 03 23:04:28 crc kubenswrapper[4830]: I1203 23:04:28.608410 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_86b4b71c-3ed3-4413-a90b-4523b4a5c549/config-reloader/0.log" Dec 03 23:04:28 crc kubenswrapper[4830]: I1203 23:04:28.608941 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_86b4b71c-3ed3-4413-a90b-4523b4a5c549/init-config-reloader/0.log" Dec 03 23:04:28 crc kubenswrapper[4830]: I1203 23:04:28.674192 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_86b4b71c-3ed3-4413-a90b-4523b4a5c549/prometheus/0.log" Dec 03 23:04:28 crc kubenswrapper[4830]: I1203 23:04:28.695937 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_86b4b71c-3ed3-4413-a90b-4523b4a5c549/thanos-sidecar/0.log" Dec 03 23:04:28 crc kubenswrapper[4830]: I1203 23:04:28.843079 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6fb3b204-2b5a-4dcb-a278-d58ea0dce557/setup-container/0.log" Dec 03 23:04:29 crc kubenswrapper[4830]: I1203 23:04:29.080069 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6fb3b204-2b5a-4dcb-a278-d58ea0dce557/setup-container/0.log" Dec 03 23:04:29 crc kubenswrapper[4830]: I1203 23:04:29.103803 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5a2c4f61-6b61-4907-8601-6eea8065d2f6/setup-container/0.log" Dec 03 23:04:29 crc kubenswrapper[4830]: I1203 23:04:29.129020 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6fb3b204-2b5a-4dcb-a278-d58ea0dce557/rabbitmq/0.log" Dec 03 23:04:29 crc kubenswrapper[4830]: I1203 23:04:29.349587 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5a2c4f61-6b61-4907-8601-6eea8065d2f6/rabbitmq/0.log" Dec 03 23:04:29 crc kubenswrapper[4830]: I1203 23:04:29.365765 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5a2c4f61-6b61-4907-8601-6eea8065d2f6/setup-container/0.log" Dec 03 23:04:29 crc kubenswrapper[4830]: I1203 23:04:29.458952 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd_2153e26c-6b48-4353-9ee3-ac526f0d76b2/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:29 crc kubenswrapper[4830]: I1203 23:04:29.585928 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-6slmb_18583450-269e-412a-99f5-203326569e83/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:29 crc kubenswrapper[4830]: I1203 23:04:29.789778 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86_c3a26e72-8b75-423c-a151-d576eb7a4128/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:29 crc kubenswrapper[4830]: I1203 23:04:29.969135 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zhhk9_922398e7-7669-44c9-89e4-cf9cfea422c8/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:30 crc kubenswrapper[4830]: I1203 23:04:30.053021 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-wlhvr_79dc5285-4e46-46d1-a708-a1f1623b7448/ssh-known-hosts-edpm-deployment/0.log" Dec 03 23:04:30 crc kubenswrapper[4830]: I1203 23:04:30.285683 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85b449dfbc-dfzlc_614c1318-6703-47ae-89e5-e9b2dd9758e3/proxy-server/0.log" Dec 03 23:04:30 crc kubenswrapper[4830]: I1203 23:04:30.302294 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85b449dfbc-dfzlc_614c1318-6703-47ae-89e5-e9b2dd9758e3/proxy-httpd/0.log" Dec 03 23:04:30 crc kubenswrapper[4830]: I1203 23:04:30.470295 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-wvw9q_19889054-44cb-47a4-a604-a319f1bd25af/swift-ring-rebalance/0.log" Dec 03 23:04:30 crc kubenswrapper[4830]: I1203 23:04:30.507869 4830 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/account-auditor/0.log" Dec 03 23:04:30 crc kubenswrapper[4830]: I1203 23:04:30.537614 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/account-reaper/0.log" Dec 03 23:04:30 crc kubenswrapper[4830]: I1203 23:04:30.701342 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/account-replicator/0.log" Dec 03 23:04:30 crc kubenswrapper[4830]: I1203 23:04:30.717984 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/account-server/0.log" Dec 03 23:04:30 crc kubenswrapper[4830]: I1203 23:04:30.737925 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/container-auditor/0.log" Dec 03 23:04:30 crc kubenswrapper[4830]: I1203 23:04:30.825163 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/container-replicator/0.log" Dec 03 23:04:30 crc kubenswrapper[4830]: I1203 23:04:30.953576 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/container-server/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 23:04:31.011849 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/container-updater/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 23:04:31.077992 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/object-auditor/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 23:04:31.163357 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/object-expirer/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 23:04:31.208614 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/object-server/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 23:04:31.211351 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/object-replicator/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 23:04:31.345579 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/rsync/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 23:04:31.352945 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/object-updater/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 23:04:31.444142 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/swift-recon-cron/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 23:04:31.574615 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl_3cbad56b-f578-4ad4-bdb4-13c72261814d/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 23:04:31.764709 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_a8c3b6fd-772d-4354-9076-a56d78d4ad0a/tempest-tests-tempest-tests-runner/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 23:04:31.777733 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ec873385-77d4-4549-8670-b2f507bd999b/test-operator-logs-container/0.log" Dec 03 23:04:31 crc kubenswrapper[4830]: I1203 
23:04:31.936748 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw_14a22d3c-4a83-41b5-b9e7-7862b7b62ab5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:04:35 crc kubenswrapper[4830]: I1203 23:04:35.186320 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_deb3672e-3fb5-4549-ae27-6f7402c1e3d8/memcached/0.log" Dec 03 23:04:56 crc kubenswrapper[4830]: I1203 23:04:56.681402 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:04:56 crc kubenswrapper[4830]: I1203 23:04:56.681916 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:04:59 crc kubenswrapper[4830]: I1203 23:04:59.375392 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qnp24_cca79891-68e7-4827-8da4-c0570dbca762/kube-rbac-proxy/0.log" Dec 03 23:04:59 crc kubenswrapper[4830]: I1203 23:04:59.527434 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qnp24_cca79891-68e7-4827-8da4-c0570dbca762/manager/0.log" Dec 03 23:04:59 crc kubenswrapper[4830]: I1203 23:04:59.611154 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-rhhdr_0ecd210b-fb48-42fe-b161-6583d913b6f8/kube-rbac-proxy/0.log" Dec 03 23:04:59 crc 
kubenswrapper[4830]: I1203 23:04:59.655365 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-rhhdr_0ecd210b-fb48-42fe-b161-6583d913b6f8/manager/0.log" Dec 03 23:04:59 crc kubenswrapper[4830]: I1203 23:04:59.916277 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/util/0.log" Dec 03 23:05:00 crc kubenswrapper[4830]: I1203 23:05:00.089763 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/util/0.log" Dec 03 23:05:00 crc kubenswrapper[4830]: I1203 23:05:00.117660 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/pull/0.log" Dec 03 23:05:00 crc kubenswrapper[4830]: I1203 23:05:00.137644 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/pull/0.log" Dec 03 23:05:00 crc kubenswrapper[4830]: I1203 23:05:00.582741 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/extract/0.log" Dec 03 23:05:00 crc kubenswrapper[4830]: I1203 23:05:00.598359 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/util/0.log" Dec 03 23:05:00 crc kubenswrapper[4830]: I1203 23:05:00.613139 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/pull/0.log" Dec 03 23:05:00 crc kubenswrapper[4830]: I1203 23:05:00.689985 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-zgtdz_22f7d8a7-b9bf-40ca-aca3-13a370558f38/kube-rbac-proxy/0.log" Dec 03 23:05:00 crc kubenswrapper[4830]: I1203 23:05:00.787011 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-zgtdz_22f7d8a7-b9bf-40ca-aca3-13a370558f38/manager/0.log" Dec 03 23:05:00 crc kubenswrapper[4830]: I1203 23:05:00.841715 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-qcz7k_e8bc8bcd-fde2-43fd-86ae-814182f2f5ac/kube-rbac-proxy/0.log" Dec 03 23:05:00 crc kubenswrapper[4830]: I1203 23:05:00.948296 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-qcz7k_e8bc8bcd-fde2-43fd-86ae-814182f2f5ac/manager/0.log" Dec 03 23:05:01 crc kubenswrapper[4830]: I1203 23:05:01.044646 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-nggxg_5e5620ec-6ef3-47fc-b88b-06a2f2849b48/manager/0.log" Dec 03 23:05:01 crc kubenswrapper[4830]: I1203 23:05:01.082473 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-nggxg_5e5620ec-6ef3-47fc-b88b-06a2f2849b48/kube-rbac-proxy/0.log" Dec 03 23:05:01 crc kubenswrapper[4830]: I1203 23:05:01.273068 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jj85k_a5f4a0b7-d118-45b5-ab87-9f03413d4671/kube-rbac-proxy/0.log" Dec 03 23:05:01 crc kubenswrapper[4830]: I1203 
23:05:01.274638 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jj85k_a5f4a0b7-d118-45b5-ab87-9f03413d4671/manager/0.log" Dec 03 23:05:01 crc kubenswrapper[4830]: I1203 23:05:01.376108 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-pd68x_4cf3851e-6624-48c2-aa71-e799e6b6b685/kube-rbac-proxy/0.log" Dec 03 23:05:01 crc kubenswrapper[4830]: I1203 23:05:01.502806 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-5t4bj_b0dc8ce5-ac38-4bac-8026-5ca446e16340/kube-rbac-proxy/0.log" Dec 03 23:05:01 crc kubenswrapper[4830]: I1203 23:05:01.586068 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-5t4bj_b0dc8ce5-ac38-4bac-8026-5ca446e16340/manager/0.log" Dec 03 23:05:01 crc kubenswrapper[4830]: I1203 23:05:01.665037 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-pd68x_4cf3851e-6624-48c2-aa71-e799e6b6b685/manager/0.log" Dec 03 23:05:01 crc kubenswrapper[4830]: I1203 23:05:01.789807 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4fshq_e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae/kube-rbac-proxy/0.log" Dec 03 23:05:01 crc kubenswrapper[4830]: I1203 23:05:01.843245 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4fshq_e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae/manager/0.log" Dec 03 23:05:01 crc kubenswrapper[4830]: I1203 23:05:01.915902 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-8wlrd_01175cc5-e6fa-4e26-b76b-6b7e2a71d51a/kube-rbac-proxy/0.log" Dec 
03 23:05:02 crc kubenswrapper[4830]: I1203 23:05:02.017154 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-8wlrd_01175cc5-e6fa-4e26-b76b-6b7e2a71d51a/manager/0.log" Dec 03 23:05:02 crc kubenswrapper[4830]: I1203 23:05:02.109401 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-48cc9_325f811a-891b-48ae-bde4-a72e7580c925/kube-rbac-proxy/0.log" Dec 03 23:05:02 crc kubenswrapper[4830]: I1203 23:05:02.290712 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-48cc9_325f811a-891b-48ae-bde4-a72e7580c925/manager/0.log" Dec 03 23:05:02 crc kubenswrapper[4830]: I1203 23:05:02.352997 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-v2gwm_90ea4083-18d1-4ace-bcc6-81489c41f117/kube-rbac-proxy/0.log" Dec 03 23:05:02 crc kubenswrapper[4830]: I1203 23:05:02.489813 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-v2gwm_90ea4083-18d1-4ace-bcc6-81489c41f117/manager/0.log" Dec 03 23:05:02 crc kubenswrapper[4830]: I1203 23:05:02.521396 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-526ng_28b1972b-42aa-4470-8be6-240b219e5975/kube-rbac-proxy/0.log" Dec 03 23:05:02 crc kubenswrapper[4830]: I1203 23:05:02.693824 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-526ng_28b1972b-42aa-4470-8be6-240b219e5975/manager/0.log" Dec 03 23:05:02 crc kubenswrapper[4830]: I1203 23:05:02.768483 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-rmd7h_d18e5b1e-653d-4c0e-928f-a2d60b2af855/kube-rbac-proxy/0.log" Dec 03 23:05:02 crc kubenswrapper[4830]: I1203 23:05:02.852759 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-rmd7h_d18e5b1e-653d-4c0e-928f-a2d60b2af855/manager/0.log" Dec 03 23:05:02 crc kubenswrapper[4830]: I1203 23:05:02.952050 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h_7d1112c1-ffce-45e1-94a4-3aad2ae50fe5/kube-rbac-proxy/0.log" Dec 03 23:05:02 crc kubenswrapper[4830]: I1203 23:05:02.991813 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h_7d1112c1-ffce-45e1-94a4-3aad2ae50fe5/manager/0.log" Dec 03 23:05:03 crc kubenswrapper[4830]: I1203 23:05:03.401301 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8545cc8fb-sm7mq_dd670738-8808-4f9b-8fea-98c4ab57fb06/operator/0.log" Dec 03 23:05:03 crc kubenswrapper[4830]: I1203 23:05:03.473755 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-pm2p5_e3d5735a-003c-4c35-9239-c80e7e6dbafc/registry-server/0.log" Dec 03 23:05:03 crc kubenswrapper[4830]: I1203 23:05:03.593649 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-vwd82_908c7892-9ff8-4d17-86ea-2daf891ea90b/kube-rbac-proxy/0.log" Dec 03 23:05:03 crc kubenswrapper[4830]: I1203 23:05:03.732308 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-vwd82_908c7892-9ff8-4d17-86ea-2daf891ea90b/manager/0.log" Dec 03 23:05:03 crc kubenswrapper[4830]: I1203 23:05:03.829024 4830 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-rs6wd_55d8936f-55fc-4a92-b8a2-c393b6b46eeb/kube-rbac-proxy/0.log" Dec 03 23:05:03 crc kubenswrapper[4830]: I1203 23:05:03.861054 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-rs6wd_55d8936f-55fc-4a92-b8a2-c393b6b46eeb/manager/0.log" Dec 03 23:05:04 crc kubenswrapper[4830]: I1203 23:05:04.007741 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-h5tll_27ed445d-9111-479a-8dc5-5808e0af45be/operator/0.log" Dec 03 23:05:04 crc kubenswrapper[4830]: I1203 23:05:04.141195 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-sqmr8_84a09470-19ba-4bef-b0de-1fa4df1561ae/kube-rbac-proxy/0.log" Dec 03 23:05:04 crc kubenswrapper[4830]: I1203 23:05:04.238016 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bb8cf96cb-6vrpp_0c670280-553c-4251-ac28-04fdd66313a7/manager/0.log" Dec 03 23:05:04 crc kubenswrapper[4830]: I1203 23:05:04.443785 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-sqmr8_84a09470-19ba-4bef-b0de-1fa4df1561ae/manager/0.log" Dec 03 23:05:04 crc kubenswrapper[4830]: I1203 23:05:04.545133 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-59779d887b-2cqbq_c0f0376d-c348-4b7b-b4e1-f8717ea05299/kube-rbac-proxy/0.log" Dec 03 23:05:04 crc kubenswrapper[4830]: I1203 23:05:04.561286 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-7bhdq_d5a49c34-e03d-49b5-a5a8-507af8ce99be/kube-rbac-proxy/0.log" Dec 03 23:05:04 crc 
kubenswrapper[4830]: I1203 23:05:04.649192 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-7bhdq_d5a49c34-e03d-49b5-a5a8-507af8ce99be/manager/0.log" Dec 03 23:05:04 crc kubenswrapper[4830]: I1203 23:05:04.836319 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-dtdgr_dec06c39-2a96-4fc6-a2e2-ad865fc394d9/kube-rbac-proxy/0.log" Dec 03 23:05:04 crc kubenswrapper[4830]: I1203 23:05:04.932411 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-dtdgr_dec06c39-2a96-4fc6-a2e2-ad865fc394d9/manager/0.log" Dec 03 23:05:04 crc kubenswrapper[4830]: I1203 23:05:04.955715 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-59779d887b-2cqbq_c0f0376d-c348-4b7b-b4e1-f8717ea05299/manager/0.log" Dec 03 23:05:25 crc kubenswrapper[4830]: I1203 23:05:25.292696 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-c2sm5_c8f89cbd-cef8-468a-973d-6e513dcb4e09/control-plane-machine-set-operator/0.log" Dec 03 23:05:25 crc kubenswrapper[4830]: I1203 23:05:25.479175 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c8f8r_b9ff0d92-ab2f-4815-9659-7b4507d64344/kube-rbac-proxy/0.log" Dec 03 23:05:25 crc kubenswrapper[4830]: I1203 23:05:25.552918 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c8f8r_b9ff0d92-ab2f-4815-9659-7b4507d64344/machine-api-operator/0.log" Dec 03 23:05:26 crc kubenswrapper[4830]: I1203 23:05:26.681390 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:05:26 crc kubenswrapper[4830]: I1203 23:05:26.681788 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:05:26 crc kubenswrapper[4830]: I1203 23:05:26.681850 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 23:05:26 crc kubenswrapper[4830]: I1203 23:05:26.682880 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f09a70d7650c20489f7d7cca35808804f160b2ea0e82eb2544a3ff9061e7435c"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 23:05:26 crc kubenswrapper[4830]: I1203 23:05:26.682954 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://f09a70d7650c20489f7d7cca35808804f160b2ea0e82eb2544a3ff9061e7435c" gracePeriod=600 Dec 03 23:05:27 crc kubenswrapper[4830]: I1203 23:05:27.452375 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="f09a70d7650c20489f7d7cca35808804f160b2ea0e82eb2544a3ff9061e7435c" exitCode=0 Dec 03 23:05:27 crc kubenswrapper[4830]: I1203 23:05:27.452457 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" 
event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"f09a70d7650c20489f7d7cca35808804f160b2ea0e82eb2544a3ff9061e7435c"} Dec 03 23:05:27 crc kubenswrapper[4830]: I1203 23:05:27.453062 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a"} Dec 03 23:05:27 crc kubenswrapper[4830]: I1203 23:05:27.453130 4830 scope.go:117] "RemoveContainer" containerID="42fd2db3bd0a60b1bf6a8b90136aece72ea2d18191fb03eec0d2d95096992ad1" Dec 03 23:05:38 crc kubenswrapper[4830]: I1203 23:05:38.850941 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-zbjs6_361b0a92-2587-4f96-a941-f9c50fd46e10/cert-manager-controller/0.log" Dec 03 23:05:39 crc kubenswrapper[4830]: I1203 23:05:39.102920 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-5c749_d84ca934-7bba-4889-b4a6-feec21575832/cert-manager-cainjector/0.log" Dec 03 23:05:39 crc kubenswrapper[4830]: I1203 23:05:39.255033 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-sr4sh_812edfa4-0a35-4dce-b14c-3addb5812eb7/cert-manager-webhook/0.log" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.756068 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q8ppp"] Dec 03 23:05:41 crc kubenswrapper[4830]: E1203 23:05:41.757236 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a9958a-caf7-4d14-b3f2-750b810845cf" containerName="container-00" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.757257 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a9958a-caf7-4d14-b3f2-750b810845cf" containerName="container-00" Dec 03 23:05:41 crc kubenswrapper[4830]: 
I1203 23:05:41.757541 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a9958a-caf7-4d14-b3f2-750b810845cf" containerName="container-00" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.759579 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q8ppp" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.786923 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q8ppp"] Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.859530 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-utilities\") pod \"redhat-operators-q8ppp\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") " pod="openshift-marketplace/redhat-operators-q8ppp" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.860334 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-catalog-content\") pod \"redhat-operators-q8ppp\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") " pod="openshift-marketplace/redhat-operators-q8ppp" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.860423 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txdmc\" (UniqueName: \"kubernetes.io/projected/3aca0487-c506-463a-a62b-218ebaa0d5a5-kube-api-access-txdmc\") pod \"redhat-operators-q8ppp\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") " pod="openshift-marketplace/redhat-operators-q8ppp" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.963006 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-catalog-content\") pod \"redhat-operators-q8ppp\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") " pod="openshift-marketplace/redhat-operators-q8ppp" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.963134 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txdmc\" (UniqueName: \"kubernetes.io/projected/3aca0487-c506-463a-a62b-218ebaa0d5a5-kube-api-access-txdmc\") pod \"redhat-operators-q8ppp\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") " pod="openshift-marketplace/redhat-operators-q8ppp" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.963325 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-utilities\") pod \"redhat-operators-q8ppp\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") " pod="openshift-marketplace/redhat-operators-q8ppp" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.963615 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-catalog-content\") pod \"redhat-operators-q8ppp\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") " pod="openshift-marketplace/redhat-operators-q8ppp" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.963855 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-utilities\") pod \"redhat-operators-q8ppp\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") " pod="openshift-marketplace/redhat-operators-q8ppp" Dec 03 23:05:41 crc kubenswrapper[4830]: I1203 23:05:41.985278 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txdmc\" (UniqueName: 
\"kubernetes.io/projected/3aca0487-c506-463a-a62b-218ebaa0d5a5-kube-api-access-txdmc\") pod \"redhat-operators-q8ppp\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") " pod="openshift-marketplace/redhat-operators-q8ppp"
Dec 03 23:05:42 crc kubenswrapper[4830]: I1203 23:05:42.094785 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q8ppp"
Dec 03 23:05:42 crc kubenswrapper[4830]: I1203 23:05:42.708747 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q8ppp"]
Dec 03 23:05:43 crc kubenswrapper[4830]: I1203 23:05:43.642820 4830 generic.go:334] "Generic (PLEG): container finished" podID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerID="019c2fc4569b93796e5bb76fd911f2d1ba7ab0b79930126aa079e20141e30a41" exitCode=0
Dec 03 23:05:43 crc kubenswrapper[4830]: I1203 23:05:43.642970 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8ppp" event={"ID":"3aca0487-c506-463a-a62b-218ebaa0d5a5","Type":"ContainerDied","Data":"019c2fc4569b93796e5bb76fd911f2d1ba7ab0b79930126aa079e20141e30a41"}
Dec 03 23:05:43 crc kubenswrapper[4830]: I1203 23:05:43.643179 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8ppp" event={"ID":"3aca0487-c506-463a-a62b-218ebaa0d5a5","Type":"ContainerStarted","Data":"341aefa73c77b816d718f5354209a110d2857a3bf4f212d2a8188aa3f630f236"}
Dec 03 23:05:44 crc kubenswrapper[4830]: I1203 23:05:44.655744 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8ppp" event={"ID":"3aca0487-c506-463a-a62b-218ebaa0d5a5","Type":"ContainerStarted","Data":"db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f"}
Dec 03 23:05:47 crc kubenswrapper[4830]: I1203 23:05:47.693409 4830 generic.go:334] "Generic (PLEG): container finished" podID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerID="db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f" exitCode=0
Dec 03 23:05:47 crc kubenswrapper[4830]: I1203 23:05:47.693502 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8ppp" event={"ID":"3aca0487-c506-463a-a62b-218ebaa0d5a5","Type":"ContainerDied","Data":"db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f"}
Dec 03 23:05:48 crc kubenswrapper[4830]: I1203 23:05:48.705657 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8ppp" event={"ID":"3aca0487-c506-463a-a62b-218ebaa0d5a5","Type":"ContainerStarted","Data":"e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a"}
Dec 03 23:05:48 crc kubenswrapper[4830]: I1203 23:05:48.733406 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q8ppp" podStartSLOduration=3.279633325 podStartE2EDuration="7.733382069s" podCreationTimestamp="2025-12-03 23:05:41 +0000 UTC" firstStartedPulling="2025-12-03 23:05:43.644610153 +0000 UTC m=+3632.641071502" lastFinishedPulling="2025-12-03 23:05:48.098358897 +0000 UTC m=+3637.094820246" observedRunningTime="2025-12-03 23:05:48.721834951 +0000 UTC m=+3637.718296310" watchObservedRunningTime="2025-12-03 23:05:48.733382069 +0000 UTC m=+3637.729843428"
Dec 03 23:05:52 crc kubenswrapper[4830]: I1203 23:05:52.095563 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q8ppp"
Dec 03 23:05:52 crc kubenswrapper[4830]: I1203 23:05:52.097394 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q8ppp"
Dec 03 23:05:53 crc kubenswrapper[4830]: I1203 23:05:53.149299 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q8ppp" podUID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerName="registry-server" probeResult="failure" output=<
Dec 03 23:05:53 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s
Dec 03 23:05:53 crc kubenswrapper[4830]: >
Dec 03 23:05:54 crc kubenswrapper[4830]: I1203 23:05:54.810174 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-8dnq8_603c63d8-e5d2-428b-925d-aab17f1889dc/nmstate-console-plugin/0.log"
Dec 03 23:05:54 crc kubenswrapper[4830]: I1203 23:05:54.985665 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-s66zt_47b33d94-20d4-4640-acb2-c25aa2903bd1/nmstate-handler/0.log"
Dec 03 23:05:55 crc kubenswrapper[4830]: I1203 23:05:55.034065 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-krngn_22fd491b-eed1-4558-86bc-ae1f601fcdd0/kube-rbac-proxy/0.log"
Dec 03 23:05:55 crc kubenswrapper[4830]: I1203 23:05:55.068269 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-krngn_22fd491b-eed1-4558-86bc-ae1f601fcdd0/nmstate-metrics/0.log"
Dec 03 23:05:55 crc kubenswrapper[4830]: I1203 23:05:55.224876 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-pncxz_de8f51ca-5084-4af1-87dc-715b869006d0/nmstate-operator/0.log"
Dec 03 23:05:55 crc kubenswrapper[4830]: I1203 23:05:55.325523 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-8pqtc_a8b2e328-2397-45e1-a593-5f5094799015/nmstate-webhook/0.log"
Dec 03 23:06:02 crc kubenswrapper[4830]: I1203 23:06:02.145955 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q8ppp"
Dec 03 23:06:02 crc kubenswrapper[4830]: I1203 23:06:02.206416 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q8ppp"
Dec 03 23:06:02 crc kubenswrapper[4830]: I1203 23:06:02.388024 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q8ppp"]
Dec 03 23:06:03 crc kubenswrapper[4830]: I1203 23:06:03.841023 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q8ppp" podUID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerName="registry-server" containerID="cri-o://e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a" gracePeriod=2
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.385136 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q8ppp"
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.594479 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-catalog-content\") pod \"3aca0487-c506-463a-a62b-218ebaa0d5a5\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") "
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.594841 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-utilities\") pod \"3aca0487-c506-463a-a62b-218ebaa0d5a5\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") "
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.594885 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txdmc\" (UniqueName: \"kubernetes.io/projected/3aca0487-c506-463a-a62b-218ebaa0d5a5-kube-api-access-txdmc\") pod \"3aca0487-c506-463a-a62b-218ebaa0d5a5\" (UID: \"3aca0487-c506-463a-a62b-218ebaa0d5a5\") "
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.595370 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-utilities" (OuterVolumeSpecName: "utilities") pod "3aca0487-c506-463a-a62b-218ebaa0d5a5" (UID: "3aca0487-c506-463a-a62b-218ebaa0d5a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.602930 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aca0487-c506-463a-a62b-218ebaa0d5a5-kube-api-access-txdmc" (OuterVolumeSpecName: "kube-api-access-txdmc") pod "3aca0487-c506-463a-a62b-218ebaa0d5a5" (UID: "3aca0487-c506-463a-a62b-218ebaa0d5a5"). InnerVolumeSpecName "kube-api-access-txdmc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.692433 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3aca0487-c506-463a-a62b-218ebaa0d5a5" (UID: "3aca0487-c506-463a-a62b-218ebaa0d5a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.697135 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txdmc\" (UniqueName: \"kubernetes.io/projected/3aca0487-c506-463a-a62b-218ebaa0d5a5-kube-api-access-txdmc\") on node \"crc\" DevicePath \"\""
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.697161 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.697171 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aca0487-c506-463a-a62b-218ebaa0d5a5-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.856978 4830 generic.go:334] "Generic (PLEG): container finished" podID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerID="e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a" exitCode=0
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.857033 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8ppp" event={"ID":"3aca0487-c506-463a-a62b-218ebaa0d5a5","Type":"ContainerDied","Data":"e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a"}
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.857064 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8ppp" event={"ID":"3aca0487-c506-463a-a62b-218ebaa0d5a5","Type":"ContainerDied","Data":"341aefa73c77b816d718f5354209a110d2857a3bf4f212d2a8188aa3f630f236"}
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.857082 4830 scope.go:117] "RemoveContainer" containerID="e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a"
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.857243 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q8ppp"
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.893411 4830 scope.go:117] "RemoveContainer" containerID="db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f"
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.904144 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q8ppp"]
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.918641 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q8ppp"]
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.928486 4830 scope.go:117] "RemoveContainer" containerID="019c2fc4569b93796e5bb76fd911f2d1ba7ab0b79930126aa079e20141e30a41"
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.986607 4830 scope.go:117] "RemoveContainer" containerID="e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a"
Dec 03 23:06:04 crc kubenswrapper[4830]: E1203 23:06:04.987436 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a\": container with ID starting with e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a not found: ID does not exist" containerID="e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a"
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.987488 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a"} err="failed to get container status \"e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a\": rpc error: code = NotFound desc = could not find container \"e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a\": container with ID starting with e83b1afe2cd44178d65a7abb8b14a4a42cd1e0b3e3cd3475fc33b7414fcd319a not found: ID does not exist"
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.987535 4830 scope.go:117] "RemoveContainer" containerID="db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f"
Dec 03 23:06:04 crc kubenswrapper[4830]: E1203 23:06:04.987955 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f\": container with ID starting with db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f not found: ID does not exist" containerID="db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f"
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.987983 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f"} err="failed to get container status \"db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f\": rpc error: code = NotFound desc = could not find container \"db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f\": container with ID starting with db27dbe31fec5cff6c79d5a3741adf7ba565e036cbc6736e06a1e38b7019d59f not found: ID does not exist"
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.987997 4830 scope.go:117] "RemoveContainer" containerID="019c2fc4569b93796e5bb76fd911f2d1ba7ab0b79930126aa079e20141e30a41"
Dec 03 23:06:04 crc kubenswrapper[4830]: E1203 23:06:04.988258 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"019c2fc4569b93796e5bb76fd911f2d1ba7ab0b79930126aa079e20141e30a41\": container with ID starting with 019c2fc4569b93796e5bb76fd911f2d1ba7ab0b79930126aa079e20141e30a41 not found: ID does not exist" containerID="019c2fc4569b93796e5bb76fd911f2d1ba7ab0b79930126aa079e20141e30a41"
Dec 03 23:06:04 crc kubenswrapper[4830]: I1203 23:06:04.988294 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019c2fc4569b93796e5bb76fd911f2d1ba7ab0b79930126aa079e20141e30a41"} err="failed to get container status \"019c2fc4569b93796e5bb76fd911f2d1ba7ab0b79930126aa079e20141e30a41\": rpc error: code = NotFound desc = could not find container \"019c2fc4569b93796e5bb76fd911f2d1ba7ab0b79930126aa079e20141e30a41\": container with ID starting with 019c2fc4569b93796e5bb76fd911f2d1ba7ab0b79930126aa079e20141e30a41 not found: ID does not exist"
Dec 03 23:06:05 crc kubenswrapper[4830]: I1203 23:06:05.357967 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aca0487-c506-463a-a62b-218ebaa0d5a5" path="/var/lib/kubelet/pods/3aca0487-c506-463a-a62b-218ebaa0d5a5/volumes"
Dec 03 23:06:09 crc kubenswrapper[4830]: I1203 23:06:09.007633 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7dcf7ddf84-vj9kj_aac5b4ce-2952-49d3-81bc-4ace758e5367/kube-rbac-proxy/0.log"
Dec 03 23:06:09 crc kubenswrapper[4830]: I1203 23:06:09.109156 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7dcf7ddf84-vj9kj_aac5b4ce-2952-49d3-81bc-4ace758e5367/manager/0.log"
Dec 03 23:06:23 crc kubenswrapper[4830]: I1203 23:06:23.479991 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-cc4b2_36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d/kube-rbac-proxy/0.log"
Dec 03 23:06:23 crc kubenswrapper[4830]: I1203 23:06:23.565291 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-cc4b2_36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d/controller/0.log"
Dec 03 23:06:23 crc kubenswrapper[4830]: I1203 23:06:23.665713 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-frr-files/0.log"
Dec 03 23:06:23 crc kubenswrapper[4830]: I1203 23:06:23.865728 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-reloader/0.log"
Dec 03 23:06:23 crc kubenswrapper[4830]: I1203 23:06:23.872622 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-frr-files/0.log"
Dec 03 23:06:23 crc kubenswrapper[4830]: I1203 23:06:23.879433 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-metrics/0.log"
Dec 03 23:06:23 crc kubenswrapper[4830]: I1203 23:06:23.910136 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-reloader/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.055658 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-frr-files/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.081379 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-reloader/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.093184 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-metrics/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.123847 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-metrics/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.337878 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/controller/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.339781 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-reloader/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.339844 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-metrics/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.339781 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-frr-files/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.519337 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/frr-metrics/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.521762 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/kube-rbac-proxy-frr/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.545600 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/kube-rbac-proxy/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.802297 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/reloader/0.log"
Dec 03 23:06:24 crc kubenswrapper[4830]: I1203 23:06:24.912480 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-dj5pn_ea33811b-aa63-48d4-b9f3-9446ca919e3c/frr-k8s-webhook-server/0.log"
Dec 03 23:06:25 crc kubenswrapper[4830]: I1203 23:06:25.073133 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6546545fd8-lr9kg_3898c0c7-fe9f-4446-803f-01d5c019b406/manager/0.log"
Dec 03 23:06:25 crc kubenswrapper[4830]: I1203 23:06:25.317728 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79d7c9f765-w2ltt_a308b4ff-4a2a-4ab4-9171-9dc572b71c29/webhook-server/0.log"
Dec 03 23:06:25 crc kubenswrapper[4830]: I1203 23:06:25.425606 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4q9rp_194dbe3b-afec-4576-b9ac-8810ad9c9482/kube-rbac-proxy/0.log"
Dec 03 23:06:25 crc kubenswrapper[4830]: I1203 23:06:25.998252 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/frr/0.log"
Dec 03 23:06:26 crc kubenswrapper[4830]: I1203 23:06:26.018572 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4q9rp_194dbe3b-afec-4576-b9ac-8810ad9c9482/speaker/0.log"
Dec 03 23:06:37 crc kubenswrapper[4830]: I1203 23:06:37.581592 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/util/0.log"
Dec 03 23:06:37 crc kubenswrapper[4830]: I1203 23:06:37.763167 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/util/0.log"
Dec 03 23:06:37 crc kubenswrapper[4830]: I1203 23:06:37.837028 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/pull/0.log"
Dec 03 23:06:37 crc kubenswrapper[4830]: I1203 23:06:37.860879 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/pull/0.log"
Dec 03 23:06:38 crc kubenswrapper[4830]: I1203 23:06:38.057310 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/pull/0.log"
Dec 03 23:06:38 crc kubenswrapper[4830]: I1203 23:06:38.060017 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/util/0.log"
Dec 03 23:06:38 crc kubenswrapper[4830]: I1203 23:06:38.109851 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/extract/0.log"
Dec 03 23:06:38 crc kubenswrapper[4830]: I1203 23:06:38.273487 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/util/0.log"
Dec 03 23:06:38 crc kubenswrapper[4830]: I1203 23:06:38.451110 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/pull/0.log"
Dec 03 23:06:38 crc kubenswrapper[4830]: I1203 23:06:38.458309 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/util/0.log"
Dec 03 23:06:38 crc kubenswrapper[4830]: I1203 23:06:38.497067 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/pull/0.log"
Dec 03 23:06:38 crc kubenswrapper[4830]: I1203 23:06:38.651773 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/util/0.log"
Dec 03 23:06:38 crc kubenswrapper[4830]: I1203 23:06:38.654594 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/pull/0.log"
Dec 03 23:06:38 crc kubenswrapper[4830]: I1203 23:06:38.687551 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/extract/0.log"
Dec 03 23:06:38 crc kubenswrapper[4830]: I1203 23:06:38.834369 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/util/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.047061 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/pull/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.065663 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/util/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.081722 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/pull/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.287273 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/pull/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.365233 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/extract/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.426557 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/util/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.512065 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/util/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.715033 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/pull/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.728223 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/util/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.761343 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/pull/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.965374 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/util/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.973829 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/pull/0.log"
Dec 03 23:06:39 crc kubenswrapper[4830]: I1203 23:06:39.999939 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/extract/0.log"
Dec 03 23:06:40 crc kubenswrapper[4830]: I1203 23:06:40.147042 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/util/0.log"
Dec 03 23:06:40 crc kubenswrapper[4830]: I1203 23:06:40.384016 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/pull/0.log"
Dec 03 23:06:40 crc kubenswrapper[4830]: I1203 23:06:40.405545 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/pull/0.log"
Dec 03 23:06:40 crc kubenswrapper[4830]: I1203 23:06:40.473531 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/util/0.log"
Dec 03 23:06:40 crc kubenswrapper[4830]: I1203 23:06:40.614349 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/extract/0.log"
Dec 03 23:06:40 crc kubenswrapper[4830]: I1203 23:06:40.614788 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/util/0.log"
Dec 03 23:06:40 crc kubenswrapper[4830]: I1203 23:06:40.660349 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/pull/0.log"
Dec 03 23:06:40 crc kubenswrapper[4830]: I1203 23:06:40.825716 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-utilities/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.001119 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-content/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.009887 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-utilities/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.057890 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-content/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.205306 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-utilities/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.262216 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-content/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.439390 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tkbfv_a87df9c3-5372-4399-877f-b132ae27b408/extract-utilities/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.719384 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tkbfv_a87df9c3-5372-4399-877f-b132ae27b408/extract-content/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.743480 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tkbfv_a87df9c3-5372-4399-877f-b132ae27b408/extract-utilities/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.778023 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/registry-server/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.821719 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tkbfv_a87df9c3-5372-4399-877f-b132ae27b408/extract-content/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.887869 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tkbfv_a87df9c3-5372-4399-877f-b132ae27b408/extract-content/0.log"
Dec 03 23:06:41 crc kubenswrapper[4830]: I1203 23:06:41.910025 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tkbfv_a87df9c3-5372-4399-877f-b132ae27b408/extract-utilities/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.007634 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7s8ph_96069a0e-4ce1-4f68-835c-0a0110f36b2c/marketplace-operator/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.119019 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-utilities/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.350778 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-utilities/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.350786 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-content/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.399673 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-content/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.603571 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-content/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.618479 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-utilities/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.639055 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tkbfv_a87df9c3-5372-4399-877f-b132ae27b408/registry-server/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.811724 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/registry-server/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.834319 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-utilities/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.971861 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-utilities/0.log"
Dec 03 23:06:42 crc kubenswrapper[4830]: I1203 23:06:42.998431 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-content/0.log"
Dec 03 23:06:43 crc kubenswrapper[4830]: I1203 23:06:43.001937 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-content/0.log"
Dec 03 23:06:43 crc kubenswrapper[4830]: I1203 23:06:43.140368 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-utilities/0.log"
Dec 03 23:06:43 crc kubenswrapper[4830]: I1203 23:06:43.187190 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-content/0.log"
Dec 03 23:06:43 crc kubenswrapper[4830]: I1203 23:06:43.807852 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/registry-server/0.log"
Dec 03 23:06:55 crc kubenswrapper[4830]: I1203 23:06:55.199777 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-qzckh_eb61183f-1e00-4056-9cf6-d1503c208d29/prometheus-operator/0.log"
Dec 03 23:06:55 crc kubenswrapper[4830]: I1203 23:06:55.328910 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad/prometheus-operator-admission-webhook/0.log"
Dec 03 23:06:55 crc kubenswrapper[4830]: I1203 23:06:55.390558 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_35bdd835-3ab4-4828-bd70-6d3f0df5131f/prometheus-operator-admission-webhook/0.log"
Dec 03 23:06:55 crc kubenswrapper[4830]: I1203 23:06:55.521867 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-49j7v_a167735b-f973-4627-b731-0d4ab1458916/operator/0.log"
Dec 03 23:06:55 crc kubenswrapper[4830]: I1203 23:06:55.630435 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-zjhqs_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85/perses-operator/0.log"
Dec 03 23:07:07 crc kubenswrapper[4830]: I1203 23:07:07.256903 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7dcf7ddf84-vj9kj_aac5b4ce-2952-49d3-81bc-4ace758e5367/kube-rbac-proxy/0.log"
Dec 03 23:07:07 crc kubenswrapper[4830]: I1203 23:07:07.321527 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7dcf7ddf84-vj9kj_aac5b4ce-2952-49d3-81bc-4ace758e5367/manager/0.log"
Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.008769 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dhspn"]
Dec 03 23:07:20 crc kubenswrapper[4830]: E1203 23:07:20.009690 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerName="extract-content"
Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.009701 4830 state_mem.go:107] "Deleted CPUSet assignment"
podUID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerName="extract-content" Dec 03 23:07:20 crc kubenswrapper[4830]: E1203 23:07:20.009711 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerName="registry-server" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.009717 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerName="registry-server" Dec 03 23:07:20 crc kubenswrapper[4830]: E1203 23:07:20.009746 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerName="extract-utilities" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.009752 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerName="extract-utilities" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.009935 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aca0487-c506-463a-a62b-218ebaa0d5a5" containerName="registry-server" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.011431 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.075147 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhspn"] Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.087782 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bgl\" (UniqueName: \"kubernetes.io/projected/adb898af-d5d6-4f66-b986-0e2d0d3b7588-kube-api-access-t8bgl\") pod \"redhat-marketplace-dhspn\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.087868 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-utilities\") pod \"redhat-marketplace-dhspn\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.087999 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-catalog-content\") pod \"redhat-marketplace-dhspn\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.189386 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8bgl\" (UniqueName: \"kubernetes.io/projected/adb898af-d5d6-4f66-b986-0e2d0d3b7588-kube-api-access-t8bgl\") pod \"redhat-marketplace-dhspn\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.189518 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-utilities\") pod \"redhat-marketplace-dhspn\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.189559 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-catalog-content\") pod \"redhat-marketplace-dhspn\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.190114 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-utilities\") pod \"redhat-marketplace-dhspn\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.191256 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-catalog-content\") pod \"redhat-marketplace-dhspn\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.213342 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bgl\" (UniqueName: \"kubernetes.io/projected/adb898af-d5d6-4f66-b986-0e2d0d3b7588-kube-api-access-t8bgl\") pod \"redhat-marketplace-dhspn\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.334553 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:20 crc kubenswrapper[4830]: I1203 23:07:20.796574 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhspn"] Dec 03 23:07:21 crc kubenswrapper[4830]: I1203 23:07:21.759030 4830 generic.go:334] "Generic (PLEG): container finished" podID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" containerID="8e5bc78f71026315b4310f1e0e5f154b4a3a5c088f40bc707cf71fcf0c62cf91" exitCode=0 Dec 03 23:07:21 crc kubenswrapper[4830]: I1203 23:07:21.759081 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhspn" event={"ID":"adb898af-d5d6-4f66-b986-0e2d0d3b7588","Type":"ContainerDied","Data":"8e5bc78f71026315b4310f1e0e5f154b4a3a5c088f40bc707cf71fcf0c62cf91"} Dec 03 23:07:21 crc kubenswrapper[4830]: I1203 23:07:21.759315 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhspn" event={"ID":"adb898af-d5d6-4f66-b986-0e2d0d3b7588","Type":"ContainerStarted","Data":"8e791124305dfd2ae58889c2baa8007c4a6277b9191387b89f313bac4aac3188"} Dec 03 23:07:21 crc kubenswrapper[4830]: I1203 23:07:21.761309 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:07:22 crc kubenswrapper[4830]: I1203 23:07:22.783127 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhspn" event={"ID":"adb898af-d5d6-4f66-b986-0e2d0d3b7588","Type":"ContainerStarted","Data":"88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17"} Dec 03 23:07:23 crc kubenswrapper[4830]: I1203 23:07:23.794546 4830 generic.go:334] "Generic (PLEG): container finished" podID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" containerID="88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17" exitCode=0 Dec 03 23:07:23 crc kubenswrapper[4830]: I1203 23:07:23.794621 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-dhspn" event={"ID":"adb898af-d5d6-4f66-b986-0e2d0d3b7588","Type":"ContainerDied","Data":"88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17"} Dec 03 23:07:24 crc kubenswrapper[4830]: I1203 23:07:24.808439 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhspn" event={"ID":"adb898af-d5d6-4f66-b986-0e2d0d3b7588","Type":"ContainerStarted","Data":"5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478"} Dec 03 23:07:24 crc kubenswrapper[4830]: I1203 23:07:24.842681 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dhspn" podStartSLOduration=3.34620101 podStartE2EDuration="5.842658794s" podCreationTimestamp="2025-12-03 23:07:19 +0000 UTC" firstStartedPulling="2025-12-03 23:07:21.760912337 +0000 UTC m=+3730.757373686" lastFinishedPulling="2025-12-03 23:07:24.257370121 +0000 UTC m=+3733.253831470" observedRunningTime="2025-12-03 23:07:24.831496716 +0000 UTC m=+3733.827958065" watchObservedRunningTime="2025-12-03 23:07:24.842658794 +0000 UTC m=+3733.839120143" Dec 03 23:07:30 crc kubenswrapper[4830]: I1203 23:07:30.335309 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:30 crc kubenswrapper[4830]: I1203 23:07:30.335832 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:30 crc kubenswrapper[4830]: I1203 23:07:30.391826 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:30 crc kubenswrapper[4830]: I1203 23:07:30.931140 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:30 crc kubenswrapper[4830]: I1203 23:07:30.987013 4830 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhspn"] Dec 03 23:07:32 crc kubenswrapper[4830]: I1203 23:07:32.893731 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dhspn" podUID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" containerName="registry-server" containerID="cri-o://5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478" gracePeriod=2 Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.537595 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.676802 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-utilities\") pod \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.676978 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-catalog-content\") pod \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.677037 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8bgl\" (UniqueName: \"kubernetes.io/projected/adb898af-d5d6-4f66-b986-0e2d0d3b7588-kube-api-access-t8bgl\") pod \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\" (UID: \"adb898af-d5d6-4f66-b986-0e2d0d3b7588\") " Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.678283 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-utilities" (OuterVolumeSpecName: "utilities") pod 
"adb898af-d5d6-4f66-b986-0e2d0d3b7588" (UID: "adb898af-d5d6-4f66-b986-0e2d0d3b7588"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.683010 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb898af-d5d6-4f66-b986-0e2d0d3b7588-kube-api-access-t8bgl" (OuterVolumeSpecName: "kube-api-access-t8bgl") pod "adb898af-d5d6-4f66-b986-0e2d0d3b7588" (UID: "adb898af-d5d6-4f66-b986-0e2d0d3b7588"). InnerVolumeSpecName "kube-api-access-t8bgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.701845 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adb898af-d5d6-4f66-b986-0e2d0d3b7588" (UID: "adb898af-d5d6-4f66-b986-0e2d0d3b7588"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.779316 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.780048 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8bgl\" (UniqueName: \"kubernetes.io/projected/adb898af-d5d6-4f66-b986-0e2d0d3b7588-kube-api-access-t8bgl\") on node \"crc\" DevicePath \"\"" Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.780067 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb898af-d5d6-4f66-b986-0e2d0d3b7588-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.908643 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhspn" Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.908764 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhspn" event={"ID":"adb898af-d5d6-4f66-b986-0e2d0d3b7588","Type":"ContainerDied","Data":"5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478"} Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.908811 4830 scope.go:117] "RemoveContainer" containerID="5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478" Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.908436 4830 generic.go:334] "Generic (PLEG): container finished" podID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" containerID="5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478" exitCode=0 Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.912869 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhspn" 
event={"ID":"adb898af-d5d6-4f66-b986-0e2d0d3b7588","Type":"ContainerDied","Data":"8e791124305dfd2ae58889c2baa8007c4a6277b9191387b89f313bac4aac3188"} Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.944578 4830 scope.go:117] "RemoveContainer" containerID="88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17" Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.949064 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhspn"] Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.974823 4830 scope.go:117] "RemoveContainer" containerID="8e5bc78f71026315b4310f1e0e5f154b4a3a5c088f40bc707cf71fcf0c62cf91" Dec 03 23:07:33 crc kubenswrapper[4830]: I1203 23:07:33.975431 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhspn"] Dec 03 23:07:34 crc kubenswrapper[4830]: I1203 23:07:34.017760 4830 scope.go:117] "RemoveContainer" containerID="5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478" Dec 03 23:07:34 crc kubenswrapper[4830]: E1203 23:07:34.018496 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478\": container with ID starting with 5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478 not found: ID does not exist" containerID="5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478" Dec 03 23:07:34 crc kubenswrapper[4830]: I1203 23:07:34.018560 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478"} err="failed to get container status \"5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478\": rpc error: code = NotFound desc = could not find container \"5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478\": container with ID starting with 
5ba43859be1264ba553ca8f3da6ce3aebb3367f915dd29d560210401183e9478 not found: ID does not exist" Dec 03 23:07:34 crc kubenswrapper[4830]: I1203 23:07:34.018589 4830 scope.go:117] "RemoveContainer" containerID="88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17" Dec 03 23:07:34 crc kubenswrapper[4830]: E1203 23:07:34.018989 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17\": container with ID starting with 88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17 not found: ID does not exist" containerID="88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17" Dec 03 23:07:34 crc kubenswrapper[4830]: I1203 23:07:34.019019 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17"} err="failed to get container status \"88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17\": rpc error: code = NotFound desc = could not find container \"88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17\": container with ID starting with 88c96aa7e7a2f70f9659821a99f0a22dccecc9cd993be12c5e5c31595dde6a17 not found: ID does not exist" Dec 03 23:07:34 crc kubenswrapper[4830]: I1203 23:07:34.019040 4830 scope.go:117] "RemoveContainer" containerID="8e5bc78f71026315b4310f1e0e5f154b4a3a5c088f40bc707cf71fcf0c62cf91" Dec 03 23:07:34 crc kubenswrapper[4830]: E1203 23:07:34.019529 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e5bc78f71026315b4310f1e0e5f154b4a3a5c088f40bc707cf71fcf0c62cf91\": container with ID starting with 8e5bc78f71026315b4310f1e0e5f154b4a3a5c088f40bc707cf71fcf0c62cf91 not found: ID does not exist" containerID="8e5bc78f71026315b4310f1e0e5f154b4a3a5c088f40bc707cf71fcf0c62cf91" Dec 03 23:07:34 crc 
kubenswrapper[4830]: I1203 23:07:34.019548 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e5bc78f71026315b4310f1e0e5f154b4a3a5c088f40bc707cf71fcf0c62cf91"} err="failed to get container status \"8e5bc78f71026315b4310f1e0e5f154b4a3a5c088f40bc707cf71fcf0c62cf91\": rpc error: code = NotFound desc = could not find container \"8e5bc78f71026315b4310f1e0e5f154b4a3a5c088f40bc707cf71fcf0c62cf91\": container with ID starting with 8e5bc78f71026315b4310f1e0e5f154b4a3a5c088f40bc707cf71fcf0c62cf91 not found: ID does not exist" Dec 03 23:07:35 crc kubenswrapper[4830]: I1203 23:07:35.351149 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" path="/var/lib/kubelet/pods/adb898af-d5d6-4f66-b986-0e2d0d3b7588/volumes" Dec 03 23:07:56 crc kubenswrapper[4830]: I1203 23:07:56.681786 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:07:56 crc kubenswrapper[4830]: I1203 23:07:56.682447 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:08:26 crc kubenswrapper[4830]: I1203 23:08:26.681578 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:08:26 crc kubenswrapper[4830]: I1203 23:08:26.682121 4830 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:08:47 crc kubenswrapper[4830]: I1203 23:08:47.668528 4830 generic.go:334] "Generic (PLEG): container finished" podID="0087c4c2-365d-4f9f-b93a-165fb0dd62cb" containerID="3aa8c058bce52985c8a3aa76ae00ad7bf01c01880723d480d647b624cc6bcbdf" exitCode=0 Dec 03 23:08:47 crc kubenswrapper[4830]: I1203 23:08:47.668700 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9twb/must-gather-4gns5" event={"ID":"0087c4c2-365d-4f9f-b93a-165fb0dd62cb","Type":"ContainerDied","Data":"3aa8c058bce52985c8a3aa76ae00ad7bf01c01880723d480d647b624cc6bcbdf"} Dec 03 23:08:47 crc kubenswrapper[4830]: I1203 23:08:47.669671 4830 scope.go:117] "RemoveContainer" containerID="3aa8c058bce52985c8a3aa76ae00ad7bf01c01880723d480d647b624cc6bcbdf" Dec 03 23:08:48 crc kubenswrapper[4830]: I1203 23:08:48.671149 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b9twb_must-gather-4gns5_0087c4c2-365d-4f9f-b93a-165fb0dd62cb/gather/0.log" Dec 03 23:08:56 crc kubenswrapper[4830]: I1203 23:08:56.477005 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b9twb/must-gather-4gns5"] Dec 03 23:08:56 crc kubenswrapper[4830]: I1203 23:08:56.477764 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-b9twb/must-gather-4gns5" podUID="0087c4c2-365d-4f9f-b93a-165fb0dd62cb" containerName="copy" containerID="cri-o://b78f0457877670c6d82176f97b64c58f3f5db8fe7642b0509ef00f14c5828006" gracePeriod=2 Dec 03 23:08:56 crc kubenswrapper[4830]: I1203 23:08:56.491403 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-b9twb/must-gather-4gns5"] Dec 03 23:08:56 crc kubenswrapper[4830]: I1203 23:08:56.681846 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:08:56 crc kubenswrapper[4830]: I1203 23:08:56.681912 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:08:56 crc kubenswrapper[4830]: I1203 23:08:56.681959 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 23:08:56 crc kubenswrapper[4830]: I1203 23:08:56.682852 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 23:08:56 crc kubenswrapper[4830]: I1203 23:08:56.682910 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" gracePeriod=600 Dec 03 23:08:56 crc kubenswrapper[4830]: I1203 23:08:56.821006 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-b9twb_must-gather-4gns5_0087c4c2-365d-4f9f-b93a-165fb0dd62cb/copy/0.log" Dec 03 23:08:56 crc kubenswrapper[4830]: I1203 23:08:56.822579 4830 generic.go:334] "Generic (PLEG): container finished" podID="0087c4c2-365d-4f9f-b93a-165fb0dd62cb" containerID="b78f0457877670c6d82176f97b64c58f3f5db8fe7642b0509ef00f14c5828006" exitCode=143 Dec 03 23:08:56 crc kubenswrapper[4830]: E1203 23:08:56.823133 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.053311 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b9twb_must-gather-4gns5_0087c4c2-365d-4f9f-b93a-165fb0dd62cb/copy/0.log" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.055111 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9twb/must-gather-4gns5" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.212705 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-must-gather-output\") pod \"0087c4c2-365d-4f9f-b93a-165fb0dd62cb\" (UID: \"0087c4c2-365d-4f9f-b93a-165fb0dd62cb\") " Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.212917 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsmkq\" (UniqueName: \"kubernetes.io/projected/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-kube-api-access-nsmkq\") pod \"0087c4c2-365d-4f9f-b93a-165fb0dd62cb\" (UID: \"0087c4c2-365d-4f9f-b93a-165fb0dd62cb\") " Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.228722 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-kube-api-access-nsmkq" (OuterVolumeSpecName: "kube-api-access-nsmkq") pod "0087c4c2-365d-4f9f-b93a-165fb0dd62cb" (UID: "0087c4c2-365d-4f9f-b93a-165fb0dd62cb"). InnerVolumeSpecName "kube-api-access-nsmkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.315525 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsmkq\" (UniqueName: \"kubernetes.io/projected/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-kube-api-access-nsmkq\") on node \"crc\" DevicePath \"\"" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.388400 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0087c4c2-365d-4f9f-b93a-165fb0dd62cb" (UID: "0087c4c2-365d-4f9f-b93a-165fb0dd62cb"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.417297 4830 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0087c4c2-365d-4f9f-b93a-165fb0dd62cb-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.836137 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b9twb_must-gather-4gns5_0087c4c2-365d-4f9f-b93a-165fb0dd62cb/copy/0.log" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.837022 4830 scope.go:117] "RemoveContainer" containerID="b78f0457877670c6d82176f97b64c58f3f5db8fe7642b0509ef00f14c5828006" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.837033 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9twb/must-gather-4gns5" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.839272 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" exitCode=0 Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.839312 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a"} Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.840027 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:08:57 crc kubenswrapper[4830]: E1203 23:08:57.840281 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.857230 4830 scope.go:117] "RemoveContainer" containerID="3aa8c058bce52985c8a3aa76ae00ad7bf01c01880723d480d647b624cc6bcbdf" Dec 03 23:08:57 crc kubenswrapper[4830]: I1203 23:08:57.974468 4830 scope.go:117] "RemoveContainer" containerID="f09a70d7650c20489f7d7cca35808804f160b2ea0e82eb2544a3ff9061e7435c" Dec 03 23:08:59 crc kubenswrapper[4830]: I1203 23:08:59.348710 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0087c4c2-365d-4f9f-b93a-165fb0dd62cb" path="/var/lib/kubelet/pods/0087c4c2-365d-4f9f-b93a-165fb0dd62cb/volumes" Dec 03 23:09:10 crc kubenswrapper[4830]: I1203 23:09:10.338560 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:09:10 crc kubenswrapper[4830]: E1203 23:09:10.339811 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:09:22 crc kubenswrapper[4830]: I1203 23:09:22.337641 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:09:22 crc kubenswrapper[4830]: E1203 23:09:22.338400 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:09:29 crc kubenswrapper[4830]: I1203 23:09:29.781426 4830 scope.go:117] "RemoveContainer" containerID="271c209b032d93e181642224c4882f9b730f0525c5f2cc410da960b15e82e72c" Dec 03 23:09:29 crc kubenswrapper[4830]: I1203 23:09:29.815636 4830 scope.go:117] "RemoveContainer" containerID="f338e44940e3224aa21c39d5e93e7755c7b0a0eb2e3e319e6b36dffcdd966edc" Dec 03 23:09:29 crc kubenswrapper[4830]: I1203 23:09:29.875137 4830 scope.go:117] "RemoveContainer" containerID="f577558fb517bfc0105cf067798df950a0d8f30597d1b8a84c94ae7039d83960" Dec 03 23:09:29 crc kubenswrapper[4830]: I1203 23:09:29.908782 4830 scope.go:117] "RemoveContainer" containerID="bdd56fcf9d3a87b57f422e916c5d4f8830cb0b8d65b6ec4bf39d83c7ac4e1259" Dec 03 23:09:35 crc kubenswrapper[4830]: I1203 23:09:35.337819 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:09:35 crc kubenswrapper[4830]: E1203 23:09:35.338685 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:09:46 crc kubenswrapper[4830]: I1203 23:09:46.338043 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:09:46 crc kubenswrapper[4830]: E1203 23:09:46.339040 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:10:01 crc kubenswrapper[4830]: I1203 23:10:01.351698 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:10:01 crc kubenswrapper[4830]: E1203 23:10:01.352414 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:10:16 crc kubenswrapper[4830]: I1203 23:10:16.337929 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:10:16 crc kubenswrapper[4830]: E1203 23:10:16.339119 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:10:30 crc kubenswrapper[4830]: I1203 23:10:29.999836 4830 scope.go:117] "RemoveContainer" containerID="5899b57330aad54a122cfb7a6be2e4bca8c5ded4c49b7022e0e593dfbd467db7" Dec 03 23:10:31 crc kubenswrapper[4830]: I1203 23:10:31.344634 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 
23:10:31 crc kubenswrapper[4830]: E1203 23:10:31.345442 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:10:43 crc kubenswrapper[4830]: I1203 23:10:43.337003 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:10:43 crc kubenswrapper[4830]: E1203 23:10:43.337841 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:10:54 crc kubenswrapper[4830]: I1203 23:10:54.336726 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:10:54 crc kubenswrapper[4830]: E1203 23:10:54.337647 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:11:06 crc kubenswrapper[4830]: I1203 23:11:06.336759 4830 scope.go:117] "RemoveContainer" 
containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:11:06 crc kubenswrapper[4830]: E1203 23:11:06.337584 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:11:20 crc kubenswrapper[4830]: I1203 23:11:20.337031 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:11:20 crc kubenswrapper[4830]: E1203 23:11:20.337802 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:11:34 crc kubenswrapper[4830]: I1203 23:11:34.337786 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:11:34 crc kubenswrapper[4830]: E1203 23:11:34.338372 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:11:47 crc kubenswrapper[4830]: I1203 23:11:47.337939 4830 scope.go:117] 
"RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:11:47 crc kubenswrapper[4830]: E1203 23:11:47.338837 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.443251 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bjgst/must-gather-n2ml9"] Dec 03 23:11:56 crc kubenswrapper[4830]: E1203 23:11:56.444306 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0087c4c2-365d-4f9f-b93a-165fb0dd62cb" containerName="gather" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.444323 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0087c4c2-365d-4f9f-b93a-165fb0dd62cb" containerName="gather" Dec 03 23:11:56 crc kubenswrapper[4830]: E1203 23:11:56.444386 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0087c4c2-365d-4f9f-b93a-165fb0dd62cb" containerName="copy" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.444394 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0087c4c2-365d-4f9f-b93a-165fb0dd62cb" containerName="copy" Dec 03 23:11:56 crc kubenswrapper[4830]: E1203 23:11:56.444412 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" containerName="extract-utilities" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.444420 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" containerName="extract-utilities" Dec 03 23:11:56 crc kubenswrapper[4830]: E1203 23:11:56.444439 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" containerName="extract-content" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.444447 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" containerName="extract-content" Dec 03 23:11:56 crc kubenswrapper[4830]: E1203 23:11:56.444458 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" containerName="registry-server" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.444465 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" containerName="registry-server" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.444815 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0087c4c2-365d-4f9f-b93a-165fb0dd62cb" containerName="gather" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.444847 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="adb898af-d5d6-4f66-b986-0e2d0d3b7588" containerName="registry-server" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.444862 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0087c4c2-365d-4f9f-b93a-165fb0dd62cb" containerName="copy" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.446357 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjgst/must-gather-n2ml9" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.448522 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bjgst"/"kube-root-ca.crt" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.448708 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bjgst"/"openshift-service-ca.crt" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.448836 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bjgst"/"default-dockercfg-59t7k" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.454273 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bjgst/must-gather-n2ml9"] Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.588119 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9dfn\" (UniqueName: \"kubernetes.io/projected/490687ea-7427-474e-9684-0894513faa07-kube-api-access-z9dfn\") pod \"must-gather-n2ml9\" (UID: \"490687ea-7427-474e-9684-0894513faa07\") " pod="openshift-must-gather-bjgst/must-gather-n2ml9" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.588208 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/490687ea-7427-474e-9684-0894513faa07-must-gather-output\") pod \"must-gather-n2ml9\" (UID: \"490687ea-7427-474e-9684-0894513faa07\") " pod="openshift-must-gather-bjgst/must-gather-n2ml9" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.691677 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/490687ea-7427-474e-9684-0894513faa07-must-gather-output\") pod \"must-gather-n2ml9\" (UID: \"490687ea-7427-474e-9684-0894513faa07\") " 
pod="openshift-must-gather-bjgst/must-gather-n2ml9" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.692005 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9dfn\" (UniqueName: \"kubernetes.io/projected/490687ea-7427-474e-9684-0894513faa07-kube-api-access-z9dfn\") pod \"must-gather-n2ml9\" (UID: \"490687ea-7427-474e-9684-0894513faa07\") " pod="openshift-must-gather-bjgst/must-gather-n2ml9" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.692268 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/490687ea-7427-474e-9684-0894513faa07-must-gather-output\") pod \"must-gather-n2ml9\" (UID: \"490687ea-7427-474e-9684-0894513faa07\") " pod="openshift-must-gather-bjgst/must-gather-n2ml9" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.726707 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9dfn\" (UniqueName: \"kubernetes.io/projected/490687ea-7427-474e-9684-0894513faa07-kube-api-access-z9dfn\") pod \"must-gather-n2ml9\" (UID: \"490687ea-7427-474e-9684-0894513faa07\") " pod="openshift-must-gather-bjgst/must-gather-n2ml9" Dec 03 23:11:56 crc kubenswrapper[4830]: I1203 23:11:56.771189 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjgst/must-gather-n2ml9" Dec 03 23:11:57 crc kubenswrapper[4830]: I1203 23:11:57.252172 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bjgst/must-gather-n2ml9"] Dec 03 23:11:57 crc kubenswrapper[4830]: I1203 23:11:57.630971 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjgst/must-gather-n2ml9" event={"ID":"490687ea-7427-474e-9684-0894513faa07","Type":"ContainerStarted","Data":"174fb4b763bb4f163bc85a04294013f98177580ffc902eddc561650221ac25ca"} Dec 03 23:11:57 crc kubenswrapper[4830]: I1203 23:11:57.631027 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjgst/must-gather-n2ml9" event={"ID":"490687ea-7427-474e-9684-0894513faa07","Type":"ContainerStarted","Data":"41c2a390f90d0d4fd5621e1283c527fc861d0f0ec5700cc32557e0e2d3099301"} Dec 03 23:11:58 crc kubenswrapper[4830]: I1203 23:11:58.337492 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:11:58 crc kubenswrapper[4830]: E1203 23:11:58.337793 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:11:58 crc kubenswrapper[4830]: I1203 23:11:58.640614 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjgst/must-gather-n2ml9" event={"ID":"490687ea-7427-474e-9684-0894513faa07","Type":"ContainerStarted","Data":"bd3fa853516a4ed5b396353cb681f5404865f957aebbb08a5ee849fe31ba19a3"} Dec 03 23:11:58 crc kubenswrapper[4830]: I1203 23:11:58.663297 4830 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-bjgst/must-gather-n2ml9" podStartSLOduration=2.663278321 podStartE2EDuration="2.663278321s" podCreationTimestamp="2025-12-03 23:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:11:58.654521419 +0000 UTC m=+4007.650982768" watchObservedRunningTime="2025-12-03 23:11:58.663278321 +0000 UTC m=+4007.659739670" Dec 03 23:12:01 crc kubenswrapper[4830]: I1203 23:12:01.374021 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bjgst/crc-debug-m7ph5"] Dec 03 23:12:01 crc kubenswrapper[4830]: I1203 23:12:01.375896 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-m7ph5" Dec 03 23:12:01 crc kubenswrapper[4830]: I1203 23:12:01.413164 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1513c642-f4af-4275-9104-c2f1251b4970-host\") pod \"crc-debug-m7ph5\" (UID: \"1513c642-f4af-4275-9104-c2f1251b4970\") " pod="openshift-must-gather-bjgst/crc-debug-m7ph5" Dec 03 23:12:01 crc kubenswrapper[4830]: I1203 23:12:01.413682 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7dgx\" (UniqueName: \"kubernetes.io/projected/1513c642-f4af-4275-9104-c2f1251b4970-kube-api-access-l7dgx\") pod \"crc-debug-m7ph5\" (UID: \"1513c642-f4af-4275-9104-c2f1251b4970\") " pod="openshift-must-gather-bjgst/crc-debug-m7ph5" Dec 03 23:12:01 crc kubenswrapper[4830]: I1203 23:12:01.515759 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1513c642-f4af-4275-9104-c2f1251b4970-host\") pod \"crc-debug-m7ph5\" (UID: \"1513c642-f4af-4275-9104-c2f1251b4970\") " pod="openshift-must-gather-bjgst/crc-debug-m7ph5" Dec 03 23:12:01 crc 
kubenswrapper[4830]: I1203 23:12:01.515929 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1513c642-f4af-4275-9104-c2f1251b4970-host\") pod \"crc-debug-m7ph5\" (UID: \"1513c642-f4af-4275-9104-c2f1251b4970\") " pod="openshift-must-gather-bjgst/crc-debug-m7ph5" Dec 03 23:12:01 crc kubenswrapper[4830]: I1203 23:12:01.516408 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7dgx\" (UniqueName: \"kubernetes.io/projected/1513c642-f4af-4275-9104-c2f1251b4970-kube-api-access-l7dgx\") pod \"crc-debug-m7ph5\" (UID: \"1513c642-f4af-4275-9104-c2f1251b4970\") " pod="openshift-must-gather-bjgst/crc-debug-m7ph5" Dec 03 23:12:01 crc kubenswrapper[4830]: I1203 23:12:01.546796 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7dgx\" (UniqueName: \"kubernetes.io/projected/1513c642-f4af-4275-9104-c2f1251b4970-kube-api-access-l7dgx\") pod \"crc-debug-m7ph5\" (UID: \"1513c642-f4af-4275-9104-c2f1251b4970\") " pod="openshift-must-gather-bjgst/crc-debug-m7ph5" Dec 03 23:12:01 crc kubenswrapper[4830]: I1203 23:12:01.704055 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-m7ph5" Dec 03 23:12:01 crc kubenswrapper[4830]: W1203 23:12:01.734062 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1513c642_f4af_4275_9104_c2f1251b4970.slice/crio-09f06eb1a762e83e40194b13e1d09eadde175d6853f13db08ce00dc73806e9c2 WatchSource:0}: Error finding container 09f06eb1a762e83e40194b13e1d09eadde175d6853f13db08ce00dc73806e9c2: Status 404 returned error can't find the container with id 09f06eb1a762e83e40194b13e1d09eadde175d6853f13db08ce00dc73806e9c2 Dec 03 23:12:02 crc kubenswrapper[4830]: I1203 23:12:02.676439 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjgst/crc-debug-m7ph5" event={"ID":"1513c642-f4af-4275-9104-c2f1251b4970","Type":"ContainerStarted","Data":"a07e4fb7cf5dd7460b2cca29e061b6d86c228f15ea19f1242c1f0745c3b50621"} Dec 03 23:12:02 crc kubenswrapper[4830]: I1203 23:12:02.676851 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjgst/crc-debug-m7ph5" event={"ID":"1513c642-f4af-4275-9104-c2f1251b4970","Type":"ContainerStarted","Data":"09f06eb1a762e83e40194b13e1d09eadde175d6853f13db08ce00dc73806e9c2"} Dec 03 23:12:02 crc kubenswrapper[4830]: I1203 23:12:02.694417 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bjgst/crc-debug-m7ph5" podStartSLOduration=1.694395832 podStartE2EDuration="1.694395832s" podCreationTimestamp="2025-12-03 23:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:12:02.691293735 +0000 UTC m=+4011.687755084" watchObservedRunningTime="2025-12-03 23:12:02.694395832 +0000 UTC m=+4011.690857181" Dec 03 23:12:03 crc kubenswrapper[4830]: I1203 23:12:03.037670 4830 trace.go:236] Trace[1696905069]: "Calculate volume metrics of prometheus-metric-storage-db for pod 
openstack/prometheus-metric-storage-0" (03-Dec-2025 23:12:02.020) (total time: 1017ms): Dec 03 23:12:03 crc kubenswrapper[4830]: Trace[1696905069]: [1.017055668s] [1.017055668s] END Dec 03 23:12:13 crc kubenswrapper[4830]: I1203 23:12:13.337104 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:12:13 crc kubenswrapper[4830]: E1203 23:12:13.337862 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:12:25 crc kubenswrapper[4830]: I1203 23:12:25.336755 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:12:25 crc kubenswrapper[4830]: E1203 23:12:25.337539 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:12:39 crc kubenswrapper[4830]: I1203 23:12:39.339183 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:12:39 crc kubenswrapper[4830]: E1203 23:12:39.340647 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:12:42 crc kubenswrapper[4830]: I1203 23:12:42.037033 4830 generic.go:334] "Generic (PLEG): container finished" podID="1513c642-f4af-4275-9104-c2f1251b4970" containerID="a07e4fb7cf5dd7460b2cca29e061b6d86c228f15ea19f1242c1f0745c3b50621" exitCode=0 Dec 03 23:12:42 crc kubenswrapper[4830]: I1203 23:12:42.037254 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjgst/crc-debug-m7ph5" event={"ID":"1513c642-f4af-4275-9104-c2f1251b4970","Type":"ContainerDied","Data":"a07e4fb7cf5dd7460b2cca29e061b6d86c228f15ea19f1242c1f0745c3b50621"} Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.175753 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-m7ph5" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.210610 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bjgst/crc-debug-m7ph5"] Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.221015 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bjgst/crc-debug-m7ph5"] Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.354875 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1513c642-f4af-4275-9104-c2f1251b4970-host\") pod \"1513c642-f4af-4275-9104-c2f1251b4970\" (UID: \"1513c642-f4af-4275-9104-c2f1251b4970\") " Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.354998 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1513c642-f4af-4275-9104-c2f1251b4970-host" (OuterVolumeSpecName: "host") pod "1513c642-f4af-4275-9104-c2f1251b4970" (UID: "1513c642-f4af-4275-9104-c2f1251b4970"). 
InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.355187 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7dgx\" (UniqueName: \"kubernetes.io/projected/1513c642-f4af-4275-9104-c2f1251b4970-kube-api-access-l7dgx\") pod \"1513c642-f4af-4275-9104-c2f1251b4970\" (UID: \"1513c642-f4af-4275-9104-c2f1251b4970\") " Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.355913 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1513c642-f4af-4275-9104-c2f1251b4970-host\") on node \"crc\" DevicePath \"\"" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.361692 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1513c642-f4af-4275-9104-c2f1251b4970-kube-api-access-l7dgx" (OuterVolumeSpecName: "kube-api-access-l7dgx") pod "1513c642-f4af-4275-9104-c2f1251b4970" (UID: "1513c642-f4af-4275-9104-c2f1251b4970"). InnerVolumeSpecName "kube-api-access-l7dgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.487258 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7dgx\" (UniqueName: \"kubernetes.io/projected/1513c642-f4af-4275-9104-c2f1251b4970-kube-api-access-l7dgx\") on node \"crc\" DevicePath \"\"" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.682415 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gdh5d"] Dec 03 23:12:43 crc kubenswrapper[4830]: E1203 23:12:43.684501 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1513c642-f4af-4275-9104-c2f1251b4970" containerName="container-00" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.684539 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1513c642-f4af-4275-9104-c2f1251b4970" containerName="container-00" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.685914 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1513c642-f4af-4275-9104-c2f1251b4970" containerName="container-00" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.688856 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.691460 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-catalog-content\") pod \"certified-operators-gdh5d\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.691661 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw4lh\" (UniqueName: \"kubernetes.io/projected/52277d4c-7d78-41ab-b2bf-bb6971c563a9-kube-api-access-vw4lh\") pod \"certified-operators-gdh5d\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.691684 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-utilities\") pod \"certified-operators-gdh5d\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.699359 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdh5d"] Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.794101 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw4lh\" (UniqueName: \"kubernetes.io/projected/52277d4c-7d78-41ab-b2bf-bb6971c563a9-kube-api-access-vw4lh\") pod \"certified-operators-gdh5d\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.794185 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-utilities\") pod \"certified-operators-gdh5d\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.794270 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-catalog-content\") pod \"certified-operators-gdh5d\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.794973 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-utilities\") pod \"certified-operators-gdh5d\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.795111 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-catalog-content\") pod \"certified-operators-gdh5d\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:43 crc kubenswrapper[4830]: I1203 23:12:43.815498 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw4lh\" (UniqueName: \"kubernetes.io/projected/52277d4c-7d78-41ab-b2bf-bb6971c563a9-kube-api-access-vw4lh\") pod \"certified-operators-gdh5d\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.021894 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.093832 4830 scope.go:117] "RemoveContainer" containerID="a07e4fb7cf5dd7460b2cca29e061b6d86c228f15ea19f1242c1f0745c3b50621" Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.094090 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-m7ph5" Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.594388 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bjgst/crc-debug-dqrd4"] Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.608898 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-dqrd4" Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.634676 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdh5d"] Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.720203 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-host\") pod \"crc-debug-dqrd4\" (UID: \"ae4f186c-73a5-4c8f-8b30-b739f2547ff3\") " pod="openshift-must-gather-bjgst/crc-debug-dqrd4" Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.720346 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnprl\" (UniqueName: \"kubernetes.io/projected/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-kube-api-access-lnprl\") pod \"crc-debug-dqrd4\" (UID: \"ae4f186c-73a5-4c8f-8b30-b739f2547ff3\") " pod="openshift-must-gather-bjgst/crc-debug-dqrd4" Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.821778 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-host\") pod \"crc-debug-dqrd4\" (UID: \"ae4f186c-73a5-4c8f-8b30-b739f2547ff3\") " pod="openshift-must-gather-bjgst/crc-debug-dqrd4" Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.822329 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnprl\" (UniqueName: \"kubernetes.io/projected/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-kube-api-access-lnprl\") pod \"crc-debug-dqrd4\" (UID: \"ae4f186c-73a5-4c8f-8b30-b739f2547ff3\") " pod="openshift-must-gather-bjgst/crc-debug-dqrd4" Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.822053 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-host\") pod \"crc-debug-dqrd4\" (UID: \"ae4f186c-73a5-4c8f-8b30-b739f2547ff3\") " pod="openshift-must-gather-bjgst/crc-debug-dqrd4" Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.852897 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnprl\" (UniqueName: \"kubernetes.io/projected/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-kube-api-access-lnprl\") pod \"crc-debug-dqrd4\" (UID: \"ae4f186c-73a5-4c8f-8b30-b739f2547ff3\") " pod="openshift-must-gather-bjgst/crc-debug-dqrd4" Dec 03 23:12:44 crc kubenswrapper[4830]: I1203 23:12:44.928154 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-dqrd4" Dec 03 23:12:44 crc kubenswrapper[4830]: W1203 23:12:44.956116 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae4f186c_73a5_4c8f_8b30_b739f2547ff3.slice/crio-4617aa02583f0fc9c55579ef378df79952004ee37f7f2fc03544bdf29c3e6060 WatchSource:0}: Error finding container 4617aa02583f0fc9c55579ef378df79952004ee37f7f2fc03544bdf29c3e6060: Status 404 returned error can't find the container with id 4617aa02583f0fc9c55579ef378df79952004ee37f7f2fc03544bdf29c3e6060 Dec 03 23:12:45 crc kubenswrapper[4830]: I1203 23:12:45.103370 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjgst/crc-debug-dqrd4" event={"ID":"ae4f186c-73a5-4c8f-8b30-b739f2547ff3","Type":"ContainerStarted","Data":"4617aa02583f0fc9c55579ef378df79952004ee37f7f2fc03544bdf29c3e6060"} Dec 03 23:12:45 crc kubenswrapper[4830]: I1203 23:12:45.106185 4830 generic.go:334] "Generic (PLEG): container finished" podID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" containerID="3ee99b936d970b6b34a7084c9d7104b9697bc99be0251ff295abc6b16c8b74a8" exitCode=0 Dec 03 23:12:45 crc kubenswrapper[4830]: I1203 23:12:45.106279 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdh5d" event={"ID":"52277d4c-7d78-41ab-b2bf-bb6971c563a9","Type":"ContainerDied","Data":"3ee99b936d970b6b34a7084c9d7104b9697bc99be0251ff295abc6b16c8b74a8"} Dec 03 23:12:45 crc kubenswrapper[4830]: I1203 23:12:45.106347 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdh5d" event={"ID":"52277d4c-7d78-41ab-b2bf-bb6971c563a9","Type":"ContainerStarted","Data":"132c6c948305a7d0ff64e7ffd6d7c5fa68f9433a4585af985c23427b0dfbdd72"} Dec 03 23:12:45 crc kubenswrapper[4830]: I1203 23:12:45.108487 4830 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 03 23:12:45 crc kubenswrapper[4830]: I1203 23:12:45.351990 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1513c642-f4af-4275-9104-c2f1251b4970" path="/var/lib/kubelet/pods/1513c642-f4af-4275-9104-c2f1251b4970/volumes" Dec 03 23:12:46 crc kubenswrapper[4830]: I1203 23:12:46.119193 4830 generic.go:334] "Generic (PLEG): container finished" podID="ae4f186c-73a5-4c8f-8b30-b739f2547ff3" containerID="3d164503b337175263c1352f97c135359e3040c2a1af340f24a947a0c3fe821b" exitCode=0 Dec 03 23:12:46 crc kubenswrapper[4830]: I1203 23:12:46.119390 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjgst/crc-debug-dqrd4" event={"ID":"ae4f186c-73a5-4c8f-8b30-b739f2547ff3","Type":"ContainerDied","Data":"3d164503b337175263c1352f97c135359e3040c2a1af340f24a947a0c3fe821b"} Dec 03 23:12:46 crc kubenswrapper[4830]: I1203 23:12:46.121494 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdh5d" event={"ID":"52277d4c-7d78-41ab-b2bf-bb6971c563a9","Type":"ContainerStarted","Data":"9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc"} Dec 03 23:12:47 crc kubenswrapper[4830]: I1203 23:12:47.133190 4830 generic.go:334] "Generic (PLEG): container finished" podID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" containerID="9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc" exitCode=0 Dec 03 23:12:47 crc kubenswrapper[4830]: I1203 23:12:47.134751 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdh5d" event={"ID":"52277d4c-7d78-41ab-b2bf-bb6971c563a9","Type":"ContainerDied","Data":"9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc"} Dec 03 23:12:47 crc kubenswrapper[4830]: I1203 23:12:47.242688 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bjgst/crc-debug-dqrd4"] Dec 03 23:12:47 crc kubenswrapper[4830]: I1203 
23:12:47.252133 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bjgst/crc-debug-dqrd4"] Dec 03 23:12:47 crc kubenswrapper[4830]: I1203 23:12:47.259876 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-dqrd4" Dec 03 23:12:47 crc kubenswrapper[4830]: I1203 23:12:47.378374 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-host\") pod \"ae4f186c-73a5-4c8f-8b30-b739f2547ff3\" (UID: \"ae4f186c-73a5-4c8f-8b30-b739f2547ff3\") " Dec 03 23:12:47 crc kubenswrapper[4830]: I1203 23:12:47.378497 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnprl\" (UniqueName: \"kubernetes.io/projected/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-kube-api-access-lnprl\") pod \"ae4f186c-73a5-4c8f-8b30-b739f2547ff3\" (UID: \"ae4f186c-73a5-4c8f-8b30-b739f2547ff3\") " Dec 03 23:12:47 crc kubenswrapper[4830]: I1203 23:12:47.378737 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-host" (OuterVolumeSpecName: "host") pod "ae4f186c-73a5-4c8f-8b30-b739f2547ff3" (UID: "ae4f186c-73a5-4c8f-8b30-b739f2547ff3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:12:47 crc kubenswrapper[4830]: I1203 23:12:47.379020 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-host\") on node \"crc\" DevicePath \"\"" Dec 03 23:12:47 crc kubenswrapper[4830]: I1203 23:12:47.385902 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-kube-api-access-lnprl" (OuterVolumeSpecName: "kube-api-access-lnprl") pod "ae4f186c-73a5-4c8f-8b30-b739f2547ff3" (UID: "ae4f186c-73a5-4c8f-8b30-b739f2547ff3"). InnerVolumeSpecName "kube-api-access-lnprl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:12:47 crc kubenswrapper[4830]: I1203 23:12:47.481469 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnprl\" (UniqueName: \"kubernetes.io/projected/ae4f186c-73a5-4c8f-8b30-b739f2547ff3-kube-api-access-lnprl\") on node \"crc\" DevicePath \"\"" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.147689 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdh5d" event={"ID":"52277d4c-7d78-41ab-b2bf-bb6971c563a9","Type":"ContainerStarted","Data":"81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263"} Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.150580 4830 scope.go:117] "RemoveContainer" containerID="3d164503b337175263c1352f97c135359e3040c2a1af340f24a947a0c3fe821b" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.150607 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-dqrd4" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.179432 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gdh5d" podStartSLOduration=2.765868869 podStartE2EDuration="5.17941455s" podCreationTimestamp="2025-12-03 23:12:43 +0000 UTC" firstStartedPulling="2025-12-03 23:12:45.108232695 +0000 UTC m=+4054.104694044" lastFinishedPulling="2025-12-03 23:12:47.521778376 +0000 UTC m=+4056.518239725" observedRunningTime="2025-12-03 23:12:48.178275689 +0000 UTC m=+4057.174737048" watchObservedRunningTime="2025-12-03 23:12:48.17941455 +0000 UTC m=+4057.175875899" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.566450 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bjgst/crc-debug-9pwdc"] Dec 03 23:12:48 crc kubenswrapper[4830]: E1203 23:12:48.566932 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4f186c-73a5-4c8f-8b30-b739f2547ff3" containerName="container-00" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.566952 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4f186c-73a5-4c8f-8b30-b739f2547ff3" containerName="container-00" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.567178 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4f186c-73a5-4c8f-8b30-b739f2547ff3" containerName="container-00" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.568500 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-9pwdc" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.705040 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0b2c93d-bda2-41cb-9c11-6bba95d52894-host\") pod \"crc-debug-9pwdc\" (UID: \"f0b2c93d-bda2-41cb-9c11-6bba95d52894\") " pod="openshift-must-gather-bjgst/crc-debug-9pwdc" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.705392 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmzkx\" (UniqueName: \"kubernetes.io/projected/f0b2c93d-bda2-41cb-9c11-6bba95d52894-kube-api-access-vmzkx\") pod \"crc-debug-9pwdc\" (UID: \"f0b2c93d-bda2-41cb-9c11-6bba95d52894\") " pod="openshift-must-gather-bjgst/crc-debug-9pwdc" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.807083 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0b2c93d-bda2-41cb-9c11-6bba95d52894-host\") pod \"crc-debug-9pwdc\" (UID: \"f0b2c93d-bda2-41cb-9c11-6bba95d52894\") " pod="openshift-must-gather-bjgst/crc-debug-9pwdc" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.807227 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0b2c93d-bda2-41cb-9c11-6bba95d52894-host\") pod \"crc-debug-9pwdc\" (UID: \"f0b2c93d-bda2-41cb-9c11-6bba95d52894\") " pod="openshift-must-gather-bjgst/crc-debug-9pwdc" Dec 03 23:12:48 crc kubenswrapper[4830]: I1203 23:12:48.807676 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmzkx\" (UniqueName: \"kubernetes.io/projected/f0b2c93d-bda2-41cb-9c11-6bba95d52894-kube-api-access-vmzkx\") pod \"crc-debug-9pwdc\" (UID: \"f0b2c93d-bda2-41cb-9c11-6bba95d52894\") " pod="openshift-must-gather-bjgst/crc-debug-9pwdc" Dec 03 23:12:49 crc 
kubenswrapper[4830]: I1203 23:12:49.022197 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmzkx\" (UniqueName: \"kubernetes.io/projected/f0b2c93d-bda2-41cb-9c11-6bba95d52894-kube-api-access-vmzkx\") pod \"crc-debug-9pwdc\" (UID: \"f0b2c93d-bda2-41cb-9c11-6bba95d52894\") " pod="openshift-must-gather-bjgst/crc-debug-9pwdc" Dec 03 23:12:49 crc kubenswrapper[4830]: I1203 23:12:49.186248 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-9pwdc" Dec 03 23:12:49 crc kubenswrapper[4830]: W1203 23:12:49.213133 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b2c93d_bda2_41cb_9c11_6bba95d52894.slice/crio-71c38fd1e2d4e2e2f72c88d579418afab6b26562b466e4da572ddbed3c80933a WatchSource:0}: Error finding container 71c38fd1e2d4e2e2f72c88d579418afab6b26562b466e4da572ddbed3c80933a: Status 404 returned error can't find the container with id 71c38fd1e2d4e2e2f72c88d579418afab6b26562b466e4da572ddbed3c80933a Dec 03 23:12:49 crc kubenswrapper[4830]: I1203 23:12:49.347905 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae4f186c-73a5-4c8f-8b30-b739f2547ff3" path="/var/lib/kubelet/pods/ae4f186c-73a5-4c8f-8b30-b739f2547ff3/volumes" Dec 03 23:12:50 crc kubenswrapper[4830]: I1203 23:12:50.198138 4830 generic.go:334] "Generic (PLEG): container finished" podID="f0b2c93d-bda2-41cb-9c11-6bba95d52894" containerID="f12bd46bd090a608078bc9843991e2a6ee2f427c5850df787e061b21ffe8135e" exitCode=0 Dec 03 23:12:50 crc kubenswrapper[4830]: I1203 23:12:50.198189 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjgst/crc-debug-9pwdc" event={"ID":"f0b2c93d-bda2-41cb-9c11-6bba95d52894","Type":"ContainerDied","Data":"f12bd46bd090a608078bc9843991e2a6ee2f427c5850df787e061b21ffe8135e"} Dec 03 23:12:50 crc kubenswrapper[4830]: I1203 23:12:50.198484 4830 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjgst/crc-debug-9pwdc" event={"ID":"f0b2c93d-bda2-41cb-9c11-6bba95d52894","Type":"ContainerStarted","Data":"71c38fd1e2d4e2e2f72c88d579418afab6b26562b466e4da572ddbed3c80933a"} Dec 03 23:12:50 crc kubenswrapper[4830]: I1203 23:12:50.256155 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bjgst/crc-debug-9pwdc"] Dec 03 23:12:50 crc kubenswrapper[4830]: I1203 23:12:50.271236 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bjgst/crc-debug-9pwdc"] Dec 03 23:12:51 crc kubenswrapper[4830]: I1203 23:12:51.324467 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-9pwdc" Dec 03 23:12:51 crc kubenswrapper[4830]: I1203 23:12:51.468444 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmzkx\" (UniqueName: \"kubernetes.io/projected/f0b2c93d-bda2-41cb-9c11-6bba95d52894-kube-api-access-vmzkx\") pod \"f0b2c93d-bda2-41cb-9c11-6bba95d52894\" (UID: \"f0b2c93d-bda2-41cb-9c11-6bba95d52894\") " Dec 03 23:12:51 crc kubenswrapper[4830]: I1203 23:12:51.468844 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0b2c93d-bda2-41cb-9c11-6bba95d52894-host\") pod \"f0b2c93d-bda2-41cb-9c11-6bba95d52894\" (UID: \"f0b2c93d-bda2-41cb-9c11-6bba95d52894\") " Dec 03 23:12:51 crc kubenswrapper[4830]: I1203 23:12:51.470104 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0b2c93d-bda2-41cb-9c11-6bba95d52894-host" (OuterVolumeSpecName: "host") pod "f0b2c93d-bda2-41cb-9c11-6bba95d52894" (UID: "f0b2c93d-bda2-41cb-9c11-6bba95d52894"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:12:51 crc kubenswrapper[4830]: I1203 23:12:51.496818 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b2c93d-bda2-41cb-9c11-6bba95d52894-kube-api-access-vmzkx" (OuterVolumeSpecName: "kube-api-access-vmzkx") pod "f0b2c93d-bda2-41cb-9c11-6bba95d52894" (UID: "f0b2c93d-bda2-41cb-9c11-6bba95d52894"). InnerVolumeSpecName "kube-api-access-vmzkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:12:51 crc kubenswrapper[4830]: I1203 23:12:51.571557 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmzkx\" (UniqueName: \"kubernetes.io/projected/f0b2c93d-bda2-41cb-9c11-6bba95d52894-kube-api-access-vmzkx\") on node \"crc\" DevicePath \"\"" Dec 03 23:12:51 crc kubenswrapper[4830]: I1203 23:12:51.571593 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0b2c93d-bda2-41cb-9c11-6bba95d52894-host\") on node \"crc\" DevicePath \"\"" Dec 03 23:12:52 crc kubenswrapper[4830]: I1203 23:12:52.219562 4830 scope.go:117] "RemoveContainer" containerID="f12bd46bd090a608078bc9843991e2a6ee2f427c5850df787e061b21ffe8135e" Dec 03 23:12:52 crc kubenswrapper[4830]: I1203 23:12:52.219624 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjgst/crc-debug-9pwdc" Dec 03 23:12:53 crc kubenswrapper[4830]: I1203 23:12:53.337344 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:12:53 crc kubenswrapper[4830]: E1203 23:12:53.337981 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:12:53 crc kubenswrapper[4830]: I1203 23:12:53.352911 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b2c93d-bda2-41cb-9c11-6bba95d52894" path="/var/lib/kubelet/pods/f0b2c93d-bda2-41cb-9c11-6bba95d52894/volumes" Dec 03 23:12:54 crc kubenswrapper[4830]: I1203 23:12:54.022154 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:54 crc kubenswrapper[4830]: I1203 23:12:54.022955 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:54 crc kubenswrapper[4830]: I1203 23:12:54.074676 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:54 crc kubenswrapper[4830]: I1203 23:12:54.293930 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:54 crc kubenswrapper[4830]: I1203 23:12:54.345614 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gdh5d"] Dec 03 23:12:56 crc kubenswrapper[4830]: I1203 23:12:56.261160 4830 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gdh5d" podUID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" containerName="registry-server" containerID="cri-o://81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263" gracePeriod=2 Dec 03 23:12:56 crc kubenswrapper[4830]: I1203 23:12:56.851316 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:56 crc kubenswrapper[4830]: I1203 23:12:56.981366 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw4lh\" (UniqueName: \"kubernetes.io/projected/52277d4c-7d78-41ab-b2bf-bb6971c563a9-kube-api-access-vw4lh\") pod \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " Dec 03 23:12:56 crc kubenswrapper[4830]: I1203 23:12:56.981434 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-catalog-content\") pod \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " Dec 03 23:12:56 crc kubenswrapper[4830]: I1203 23:12:56.981470 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-utilities\") pod \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\" (UID: \"52277d4c-7d78-41ab-b2bf-bb6971c563a9\") " Dec 03 23:12:56 crc kubenswrapper[4830]: I1203 23:12:56.982425 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-utilities" (OuterVolumeSpecName: "utilities") pod "52277d4c-7d78-41ab-b2bf-bb6971c563a9" (UID: "52277d4c-7d78-41ab-b2bf-bb6971c563a9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:12:56 crc kubenswrapper[4830]: I1203 23:12:56.987769 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52277d4c-7d78-41ab-b2bf-bb6971c563a9-kube-api-access-vw4lh" (OuterVolumeSpecName: "kube-api-access-vw4lh") pod "52277d4c-7d78-41ab-b2bf-bb6971c563a9" (UID: "52277d4c-7d78-41ab-b2bf-bb6971c563a9"). InnerVolumeSpecName "kube-api-access-vw4lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.037065 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52277d4c-7d78-41ab-b2bf-bb6971c563a9" (UID: "52277d4c-7d78-41ab-b2bf-bb6971c563a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.084323 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw4lh\" (UniqueName: \"kubernetes.io/projected/52277d4c-7d78-41ab-b2bf-bb6971c563a9-kube-api-access-vw4lh\") on node \"crc\" DevicePath \"\"" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.084363 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.084374 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52277d4c-7d78-41ab-b2bf-bb6971c563a9-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.274377 4830 generic.go:334] "Generic (PLEG): container finished" podID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" 
containerID="81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263" exitCode=0 Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.274426 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdh5d" event={"ID":"52277d4c-7d78-41ab-b2bf-bb6971c563a9","Type":"ContainerDied","Data":"81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263"} Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.274457 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdh5d" event={"ID":"52277d4c-7d78-41ab-b2bf-bb6971c563a9","Type":"ContainerDied","Data":"132c6c948305a7d0ff64e7ffd6d7c5fa68f9433a4585af985c23427b0dfbdd72"} Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.274477 4830 scope.go:117] "RemoveContainer" containerID="81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.276552 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdh5d" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.303485 4830 scope.go:117] "RemoveContainer" containerID="9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.325738 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gdh5d"] Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.352186 4830 scope.go:117] "RemoveContainer" containerID="3ee99b936d970b6b34a7084c9d7104b9697bc99be0251ff295abc6b16c8b74a8" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.354349 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gdh5d"] Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.382767 4830 scope.go:117] "RemoveContainer" containerID="81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263" Dec 03 23:12:57 crc kubenswrapper[4830]: E1203 23:12:57.383306 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263\": container with ID starting with 81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263 not found: ID does not exist" containerID="81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.383349 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263"} err="failed to get container status \"81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263\": rpc error: code = NotFound desc = could not find container \"81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263\": container with ID starting with 81212d096ca8e5ccb8b64b25cc0d2c1520ddc880df275a291be9394ab11ef263 not 
found: ID does not exist" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.383380 4830 scope.go:117] "RemoveContainer" containerID="9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc" Dec 03 23:12:57 crc kubenswrapper[4830]: E1203 23:12:57.383860 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc\": container with ID starting with 9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc not found: ID does not exist" containerID="9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.384088 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc"} err="failed to get container status \"9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc\": rpc error: code = NotFound desc = could not find container \"9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc\": container with ID starting with 9fbcebfd23038fe579d6ed56045ee6e571a54734a3256a8c9dc7e5784a6835dc not found: ID does not exist" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.384469 4830 scope.go:117] "RemoveContainer" containerID="3ee99b936d970b6b34a7084c9d7104b9697bc99be0251ff295abc6b16c8b74a8" Dec 03 23:12:57 crc kubenswrapper[4830]: E1203 23:12:57.384925 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee99b936d970b6b34a7084c9d7104b9697bc99be0251ff295abc6b16c8b74a8\": container with ID starting with 3ee99b936d970b6b34a7084c9d7104b9697bc99be0251ff295abc6b16c8b74a8 not found: ID does not exist" containerID="3ee99b936d970b6b34a7084c9d7104b9697bc99be0251ff295abc6b16c8b74a8" Dec 03 23:12:57 crc kubenswrapper[4830]: I1203 23:12:57.385033 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee99b936d970b6b34a7084c9d7104b9697bc99be0251ff295abc6b16c8b74a8"} err="failed to get container status \"3ee99b936d970b6b34a7084c9d7104b9697bc99be0251ff295abc6b16c8b74a8\": rpc error: code = NotFound desc = could not find container \"3ee99b936d970b6b34a7084c9d7104b9697bc99be0251ff295abc6b16c8b74a8\": container with ID starting with 3ee99b936d970b6b34a7084c9d7104b9697bc99be0251ff295abc6b16c8b74a8 not found: ID does not exist" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.348250 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" path="/var/lib/kubelet/pods/52277d4c-7d78-41ab-b2bf-bb6971c563a9/volumes" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.720563 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ls26g"] Dec 03 23:12:59 crc kubenswrapper[4830]: E1203 23:12:59.721438 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" containerName="extract-content" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.721462 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" containerName="extract-content" Dec 03 23:12:59 crc kubenswrapper[4830]: E1203 23:12:59.721494 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" containerName="extract-utilities" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.721502 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" containerName="extract-utilities" Dec 03 23:12:59 crc kubenswrapper[4830]: E1203 23:12:59.721538 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b2c93d-bda2-41cb-9c11-6bba95d52894" containerName="container-00" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 
23:12:59.721546 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b2c93d-bda2-41cb-9c11-6bba95d52894" containerName="container-00" Dec 03 23:12:59 crc kubenswrapper[4830]: E1203 23:12:59.721566 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" containerName="registry-server" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.721573 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" containerName="registry-server" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.721816 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="52277d4c-7d78-41ab-b2bf-bb6971c563a9" containerName="registry-server" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.721839 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b2c93d-bda2-41cb-9c11-6bba95d52894" containerName="container-00" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.724045 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.733203 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ls26g"] Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.839569 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b0cb96d-c126-4872-a661-61efcba529c3-utilities\") pod \"community-operators-ls26g\" (UID: \"1b0cb96d-c126-4872-a661-61efcba529c3\") " pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.839629 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b0cb96d-c126-4872-a661-61efcba529c3-catalog-content\") pod \"community-operators-ls26g\" (UID: \"1b0cb96d-c126-4872-a661-61efcba529c3\") " pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.839669 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm6jn\" (UniqueName: \"kubernetes.io/projected/1b0cb96d-c126-4872-a661-61efcba529c3-kube-api-access-zm6jn\") pod \"community-operators-ls26g\" (UID: \"1b0cb96d-c126-4872-a661-61efcba529c3\") " pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.942242 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b0cb96d-c126-4872-a661-61efcba529c3-utilities\") pod \"community-operators-ls26g\" (UID: \"1b0cb96d-c126-4872-a661-61efcba529c3\") " pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.942298 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b0cb96d-c126-4872-a661-61efcba529c3-catalog-content\") pod \"community-operators-ls26g\" (UID: \"1b0cb96d-c126-4872-a661-61efcba529c3\") " pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.942345 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm6jn\" (UniqueName: \"kubernetes.io/projected/1b0cb96d-c126-4872-a661-61efcba529c3-kube-api-access-zm6jn\") pod \"community-operators-ls26g\" (UID: \"1b0cb96d-c126-4872-a661-61efcba529c3\") " pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.942884 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b0cb96d-c126-4872-a661-61efcba529c3-utilities\") pod \"community-operators-ls26g\" (UID: \"1b0cb96d-c126-4872-a661-61efcba529c3\") " pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.942912 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b0cb96d-c126-4872-a661-61efcba529c3-catalog-content\") pod \"community-operators-ls26g\" (UID: \"1b0cb96d-c126-4872-a661-61efcba529c3\") " pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:12:59 crc kubenswrapper[4830]: I1203 23:12:59.966976 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm6jn\" (UniqueName: \"kubernetes.io/projected/1b0cb96d-c126-4872-a661-61efcba529c3-kube-api-access-zm6jn\") pod \"community-operators-ls26g\" (UID: \"1b0cb96d-c126-4872-a661-61efcba529c3\") " pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:13:00 crc kubenswrapper[4830]: I1203 23:13:00.055714 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:13:00 crc kubenswrapper[4830]: I1203 23:13:00.614801 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ls26g"] Dec 03 23:13:01 crc kubenswrapper[4830]: I1203 23:13:01.314715 4830 generic.go:334] "Generic (PLEG): container finished" podID="1b0cb96d-c126-4872-a661-61efcba529c3" containerID="5913a2ca885e6f74e1b460ad91df3bc7d37f005c66c7bddeef9a722f4ed66e6c" exitCode=0 Dec 03 23:13:01 crc kubenswrapper[4830]: I1203 23:13:01.314829 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ls26g" event={"ID":"1b0cb96d-c126-4872-a661-61efcba529c3","Type":"ContainerDied","Data":"5913a2ca885e6f74e1b460ad91df3bc7d37f005c66c7bddeef9a722f4ed66e6c"} Dec 03 23:13:01 crc kubenswrapper[4830]: I1203 23:13:01.315054 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ls26g" event={"ID":"1b0cb96d-c126-4872-a661-61efcba529c3","Type":"ContainerStarted","Data":"f725c410dd56ece428ce2a42e6122a50f657d2aa3bb39dfc53d56a92caf65362"} Dec 03 23:13:06 crc kubenswrapper[4830]: I1203 23:13:06.372788 4830 generic.go:334] "Generic (PLEG): container finished" podID="1b0cb96d-c126-4872-a661-61efcba529c3" containerID="4fe46e5cece7484a4700d9b8c064cfc8ed844bdc1fdba4ea7589833106811623" exitCode=0 Dec 03 23:13:06 crc kubenswrapper[4830]: I1203 23:13:06.372872 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ls26g" event={"ID":"1b0cb96d-c126-4872-a661-61efcba529c3","Type":"ContainerDied","Data":"4fe46e5cece7484a4700d9b8c064cfc8ed844bdc1fdba4ea7589833106811623"} Dec 03 23:13:07 crc kubenswrapper[4830]: I1203 23:13:07.383705 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ls26g" 
event={"ID":"1b0cb96d-c126-4872-a661-61efcba529c3","Type":"ContainerStarted","Data":"0373eb74b818dc7bc5f97b3c329593f3bd8fe4107f29bbfbdc90a8f17625e410"} Dec 03 23:13:07 crc kubenswrapper[4830]: I1203 23:13:07.411125 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ls26g" podStartSLOduration=2.90439131 podStartE2EDuration="8.411107971s" podCreationTimestamp="2025-12-03 23:12:59 +0000 UTC" firstStartedPulling="2025-12-03 23:13:01.316151267 +0000 UTC m=+4070.312612616" lastFinishedPulling="2025-12-03 23:13:06.822867928 +0000 UTC m=+4075.819329277" observedRunningTime="2025-12-03 23:13:07.409296782 +0000 UTC m=+4076.405758131" watchObservedRunningTime="2025-12-03 23:13:07.411107971 +0000 UTC m=+4076.407569320" Dec 03 23:13:08 crc kubenswrapper[4830]: I1203 23:13:08.337525 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:13:08 crc kubenswrapper[4830]: E1203 23:13:08.338158 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:13:10 crc kubenswrapper[4830]: I1203 23:13:10.056758 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:13:10 crc kubenswrapper[4830]: I1203 23:13:10.057094 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:13:10 crc kubenswrapper[4830]: I1203 23:13:10.129670 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:13:20 crc kubenswrapper[4830]: I1203 23:13:20.122429 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ls26g" Dec 03 23:13:20 crc kubenswrapper[4830]: I1203 23:13:20.187302 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ls26g"] Dec 03 23:13:20 crc kubenswrapper[4830]: I1203 23:13:20.243374 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tkbfv"] Dec 03 23:13:20 crc kubenswrapper[4830]: I1203 23:13:20.243673 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tkbfv" podUID="a87df9c3-5372-4399-877f-b132ae27b408" containerName="registry-server" containerID="cri-o://fe2f0c4ce7981412cab19b2c8cdc51891b19b179689c656709298438b89dcdf4" gracePeriod=2 Dec 03 23:13:20 crc kubenswrapper[4830]: I1203 23:13:20.943188 4830 generic.go:334] "Generic (PLEG): container finished" podID="a87df9c3-5372-4399-877f-b132ae27b408" containerID="fe2f0c4ce7981412cab19b2c8cdc51891b19b179689c656709298438b89dcdf4" exitCode=0 Dec 03 23:13:20 crc kubenswrapper[4830]: I1203 23:13:20.943724 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkbfv" event={"ID":"a87df9c3-5372-4399-877f-b132ae27b408","Type":"ContainerDied","Data":"fe2f0c4ce7981412cab19b2c8cdc51891b19b179689c656709298438b89dcdf4"} Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.573862 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tkbfv" Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.658491 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnp5j\" (UniqueName: \"kubernetes.io/projected/a87df9c3-5372-4399-877f-b132ae27b408-kube-api-access-jnp5j\") pod \"a87df9c3-5372-4399-877f-b132ae27b408\" (UID: \"a87df9c3-5372-4399-877f-b132ae27b408\") " Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.658727 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-catalog-content\") pod \"a87df9c3-5372-4399-877f-b132ae27b408\" (UID: \"a87df9c3-5372-4399-877f-b132ae27b408\") " Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.658796 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-utilities\") pod \"a87df9c3-5372-4399-877f-b132ae27b408\" (UID: \"a87df9c3-5372-4399-877f-b132ae27b408\") " Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.660217 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-utilities" (OuterVolumeSpecName: "utilities") pod "a87df9c3-5372-4399-877f-b132ae27b408" (UID: "a87df9c3-5372-4399-877f-b132ae27b408"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.663693 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a87df9c3-5372-4399-877f-b132ae27b408-kube-api-access-jnp5j" (OuterVolumeSpecName: "kube-api-access-jnp5j") pod "a87df9c3-5372-4399-877f-b132ae27b408" (UID: "a87df9c3-5372-4399-877f-b132ae27b408"). InnerVolumeSpecName "kube-api-access-jnp5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.730702 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a87df9c3-5372-4399-877f-b132ae27b408" (UID: "a87df9c3-5372-4399-877f-b132ae27b408"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.761383 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.761446 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a87df9c3-5372-4399-877f-b132ae27b408-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.761459 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnp5j\" (UniqueName: \"kubernetes.io/projected/a87df9c3-5372-4399-877f-b132ae27b408-kube-api-access-jnp5j\") on node \"crc\" DevicePath \"\"" Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.957743 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkbfv" event={"ID":"a87df9c3-5372-4399-877f-b132ae27b408","Type":"ContainerDied","Data":"38ee9bbb514fdd40955938d748ed5780d96805a14e9123877a79d3f8661f209a"} Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.958140 4830 scope.go:117] "RemoveContainer" containerID="fe2f0c4ce7981412cab19b2c8cdc51891b19b179689c656709298438b89dcdf4" Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.958325 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tkbfv" Dec 03 23:13:21 crc kubenswrapper[4830]: I1203 23:13:21.992051 4830 scope.go:117] "RemoveContainer" containerID="121cc330d07c4a493aac005dedf718fd9573082e57195ac30fa8c0e6fd2cf359" Dec 03 23:13:22 crc kubenswrapper[4830]: I1203 23:13:22.021494 4830 scope.go:117] "RemoveContainer" containerID="7428cf4d1066813b915c8018f2d0f437cbd075335f2b71d63115850be3d78561" Dec 03 23:13:22 crc kubenswrapper[4830]: I1203 23:13:22.043815 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tkbfv"] Dec 03 23:13:22 crc kubenswrapper[4830]: I1203 23:13:22.065652 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tkbfv"] Dec 03 23:13:22 crc kubenswrapper[4830]: I1203 23:13:22.336957 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:13:22 crc kubenswrapper[4830]: E1203 23:13:22.337363 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:13:23 crc kubenswrapper[4830]: I1203 23:13:23.350527 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a87df9c3-5372-4399-877f-b132ae27b408" path="/var/lib/kubelet/pods/a87df9c3-5372-4399-877f-b132ae27b408/volumes" Dec 03 23:13:33 crc kubenswrapper[4830]: I1203 23:13:33.874665 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cd27b153-5334-4329-91de-0e6941ae9e97/init-config-reloader/0.log" Dec 03 23:13:34 crc kubenswrapper[4830]: I1203 23:13:34.093324 
4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cd27b153-5334-4329-91de-0e6941ae9e97/init-config-reloader/0.log" Dec 03 23:13:34 crc kubenswrapper[4830]: I1203 23:13:34.099530 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cd27b153-5334-4329-91de-0e6941ae9e97/config-reloader/0.log" Dec 03 23:13:34 crc kubenswrapper[4830]: I1203 23:13:34.153667 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cd27b153-5334-4329-91de-0e6941ae9e97/alertmanager/0.log" Dec 03 23:13:34 crc kubenswrapper[4830]: I1203 23:13:34.285232 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6444ff6d8d-kq894_9e17dece-7a57-4c56-b128-0316add6808f/barbican-api/0.log" Dec 03 23:13:34 crc kubenswrapper[4830]: I1203 23:13:34.287955 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6444ff6d8d-kq894_9e17dece-7a57-4c56-b128-0316add6808f/barbican-api-log/0.log" Dec 03 23:13:34 crc kubenswrapper[4830]: I1203 23:13:34.337419 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:13:34 crc kubenswrapper[4830]: E1203 23:13:34.337725 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:13:34 crc kubenswrapper[4830]: I1203 23:13:34.346455 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8f8d56fd8-kd4r4_56894369-6e4d-451e-b510-60c1cad4b111/barbican-keystone-listener/0.log" Dec 
03 23:13:34 crc kubenswrapper[4830]: I1203 23:13:34.537549 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c48ccfdc-rn6l7_f622ae76-ce43-4025-90eb-e609fbe2a004/barbican-worker/0.log" Dec 03 23:13:34 crc kubenswrapper[4830]: I1203 23:13:34.601068 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8f8d56fd8-kd4r4_56894369-6e4d-451e-b510-60c1cad4b111/barbican-keystone-listener-log/0.log" Dec 03 23:13:34 crc kubenswrapper[4830]: I1203 23:13:34.719609 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c48ccfdc-rn6l7_f622ae76-ce43-4025-90eb-e609fbe2a004/barbican-worker-log/0.log" Dec 03 23:13:34 crc kubenswrapper[4830]: I1203 23:13:34.854675 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-pdcv7_be2984ba-f7dc-4271-ad04-f59c4ad3729d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:35 crc kubenswrapper[4830]: I1203 23:13:35.173979 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf/ceilometer-central-agent/0.log" Dec 03 23:13:35 crc kubenswrapper[4830]: I1203 23:13:35.213725 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf/ceilometer-notification-agent/0.log" Dec 03 23:13:35 crc kubenswrapper[4830]: I1203 23:13:35.236185 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf/sg-core/0.log" Dec 03 23:13:35 crc kubenswrapper[4830]: I1203 23:13:35.240446 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e2cd0f55-f0aa-4fa0-b011-56fe37c2bfcf/proxy-httpd/0.log" Dec 03 23:13:35 crc kubenswrapper[4830]: I1203 23:13:35.413451 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_974ffdb3-a522-4b55-bdf0-b935f1378f20/cinder-api/0.log" Dec 03 23:13:35 crc kubenswrapper[4830]: I1203 23:13:35.503273 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_974ffdb3-a522-4b55-bdf0-b935f1378f20/cinder-api-log/0.log" Dec 03 23:13:35 crc kubenswrapper[4830]: I1203 23:13:35.638100 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_499f12cf-14df-48d9-b2ee-11691c85e1ed/probe/0.log" Dec 03 23:13:35 crc kubenswrapper[4830]: I1203 23:13:35.669298 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_499f12cf-14df-48d9-b2ee-11691c85e1ed/cinder-scheduler/0.log" Dec 03 23:13:35 crc kubenswrapper[4830]: I1203 23:13:35.805067 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_74cd90ac-c295-404a-afa5-d2977c397561/cloudkitty-api/0.log" Dec 03 23:13:35 crc kubenswrapper[4830]: I1203 23:13:35.891128 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_74cd90ac-c295-404a-afa5-d2977c397561/cloudkitty-api-log/0.log" Dec 03 23:13:36 crc kubenswrapper[4830]: I1203 23:13:36.084814 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_83a885d2-eea8-4a2c-83d7-a0a945597421/loki-compactor/0.log" Dec 03 23:13:36 crc kubenswrapper[4830]: I1203 23:13:36.097465 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-56cd74f89f-gvmnk_596170cd-57e9-4665-947d-ddb1549a38e0/loki-distributor/0.log" Dec 03 23:13:36 crc kubenswrapper[4830]: I1203 23:13:36.282422 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-gbw74_800e6ad6-526b-4134-b759-b9c0d884e3f5/gateway/0.log" Dec 03 23:13:36 crc kubenswrapper[4830]: I1203 23:13:36.414031 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-tbz5t_bb34bcb7-4a40-4d5b-a5ca-55571c61b999/gateway/0.log" Dec 03 23:13:36 crc kubenswrapper[4830]: I1203 23:13:36.515718 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_5af9b96f-fb0a-482b-9000-3b76a8c6c07c/loki-index-gateway/0.log" Dec 03 23:13:37 crc kubenswrapper[4830]: I1203 23:13:37.137070 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-779849886d-qg9t9_de378972-d74f-44fe-a727-19bde47f0cbe/loki-query-frontend/0.log" Dec 03 23:13:37 crc kubenswrapper[4830]: I1203 23:13:37.170229 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_09564097-60ae-4b1d-bd03-ba8b5a254167/loki-ingester/0.log" Dec 03 23:13:37 crc kubenswrapper[4830]: I1203 23:13:37.558069 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-5zttp_30423410-ddd7-4a94-8a3a-b20b6dfd64c8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:37 crc kubenswrapper[4830]: I1203 23:13:37.819834 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-xkwz4_6145c52f-1582-4092-a44a-cc665216b2af/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:37 crc kubenswrapper[4830]: I1203 23:13:37.827995 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-548665d79b-xntf7_2d6f2070-f2d3-47d2-b43f-dfdaed23e03b/loki-querier/0.log" Dec 03 23:13:38 crc kubenswrapper[4830]: I1203 23:13:38.060849 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pzsz8_f5a1439e-ab65-4263-bbfc-09933a4db924/init/0.log" Dec 03 23:13:38 crc kubenswrapper[4830]: I1203 23:13:38.298454 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pzsz8_f5a1439e-ab65-4263-bbfc-09933a4db924/init/0.log" Dec 03 23:13:38 crc kubenswrapper[4830]: I1203 23:13:38.312874 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-zh8fj_1f140812-ce94-43b3-bdec-466d8a3d2417/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:38 crc kubenswrapper[4830]: I1203 23:13:38.449234 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pzsz8_f5a1439e-ab65-4263-bbfc-09933a4db924/dnsmasq-dns/0.log" Dec 03 23:13:38 crc kubenswrapper[4830]: I1203 23:13:38.539357 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_063860c7-63d8-4cec-b4f1-b6b501779d90/glance-httpd/0.log" Dec 03 23:13:38 crc kubenswrapper[4830]: I1203 23:13:38.630553 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_063860c7-63d8-4cec-b4f1-b6b501779d90/glance-log/0.log" Dec 03 23:13:38 crc kubenswrapper[4830]: I1203 23:13:38.782496 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62/glance-httpd/0.log" Dec 03 23:13:38 crc kubenswrapper[4830]: I1203 23:13:38.822256 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ed4f6693-f6bc-4a8c-b836-f0aaea3bfc62/glance-log/0.log" Dec 03 23:13:39 crc kubenswrapper[4830]: I1203 23:13:39.000131 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-v5wqs_1c532c81-40fc-4058-bb22-abec161c538a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:39 crc kubenswrapper[4830]: I1203 23:13:39.501396 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kk4xf_4fc52ab9-68d4-4a61-a92f-8de75a563adc/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:39 crc kubenswrapper[4830]: I1203 23:13:39.720942 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29413381-s6q5j_29ffb417-4cef-4a91-a9f7-74fcc99e14df/keystone-cron/0.log" Dec 03 23:13:39 crc kubenswrapper[4830]: I1203 23:13:39.909232 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4e04514b-2828-4cd0-9fa7-7d5a970957a0/kube-state-metrics/0.log" Dec 03 23:13:40 crc kubenswrapper[4830]: I1203 23:13:40.010469 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-79b896c7bd-gsfgm_ca47aca8-81d5-4c28-b82b-147b7835a87d/keystone-api/0.log" Dec 03 23:13:40 crc kubenswrapper[4830]: I1203 23:13:40.128310 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-j4wf7_dcb8ca11-3bdf-4f85-9f2e-67fe34a81c51/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:40 crc kubenswrapper[4830]: I1203 23:13:40.163557 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_6be5d9e4-3136-4c97-89ef-9376c1ef588c/cloudkitty-proc/0.log" Dec 03 23:13:40 crc kubenswrapper[4830]: I1203 23:13:40.508745 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57d569655f-t92g7_6d9377f1-1fe5-4451-8224-b5e9e253efa5/neutron-httpd/0.log" Dec 03 23:13:40 crc kubenswrapper[4830]: I1203 23:13:40.547718 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-g48mz_f9da9e03-f9ca-47a1-902a-c8f1b66b3fbf/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:40 crc kubenswrapper[4830]: I1203 23:13:40.585976 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-57d569655f-t92g7_6d9377f1-1fe5-4451-8224-b5e9e253efa5/neutron-api/0.log" Dec 03 23:13:41 crc kubenswrapper[4830]: I1203 23:13:41.188463 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_22dd5f84-7ae4-442e-b6a1-dd27b2d3875d/nova-api-log/0.log" Dec 03 23:13:41 crc kubenswrapper[4830]: I1203 23:13:41.238132 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4085042f-6768-44ca-be35-b0a9c78655fa/nova-cell0-conductor-conductor/0.log" Dec 03 23:13:41 crc kubenswrapper[4830]: I1203 23:13:41.563190 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_22dd5f84-7ae4-442e-b6a1-dd27b2d3875d/nova-api-api/0.log" Dec 03 23:13:41 crc kubenswrapper[4830]: I1203 23:13:41.943951 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1eea67ad-cea1-4acf-a451-76a490e27693/nova-cell1-conductor-conductor/0.log" Dec 03 23:13:41 crc kubenswrapper[4830]: I1203 23:13:41.966651 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_33e1d4ce-81f4-4a02-8ac5-384686943b19/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 23:13:42 crc kubenswrapper[4830]: I1203 23:13:42.111171 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-cjj5b_fde89bd5-aa9c-44c2-b854-696c3e0f50e7/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:42 crc kubenswrapper[4830]: I1203 23:13:42.244790 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b44e63ea-f87b-48f3-8af7-bc3e35ce5265/nova-metadata-log/0.log" Dec 03 23:13:42 crc kubenswrapper[4830]: I1203 23:13:42.582295 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_4dd8f4ca-85be-49df-b95a-adb609cbbff2/nova-scheduler-scheduler/0.log" Dec 03 23:13:42 crc kubenswrapper[4830]: I1203 23:13:42.626812 
4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_21e1ac03-6466-4663-bff2-68ff2cc7801d/mysql-bootstrap/0.log" Dec 03 23:13:42 crc kubenswrapper[4830]: I1203 23:13:42.966834 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_21e1ac03-6466-4663-bff2-68ff2cc7801d/mysql-bootstrap/0.log" Dec 03 23:13:42 crc kubenswrapper[4830]: I1203 23:13:42.971822 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_21e1ac03-6466-4663-bff2-68ff2cc7801d/galera/0.log" Dec 03 23:13:43 crc kubenswrapper[4830]: I1203 23:13:43.167776 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_85bd20b1-76d6-4238-be14-1c5891d6bbd8/mysql-bootstrap/0.log" Dec 03 23:13:43 crc kubenswrapper[4830]: I1203 23:13:43.589089 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b44e63ea-f87b-48f3-8af7-bc3e35ce5265/nova-metadata-metadata/0.log" Dec 03 23:13:43 crc kubenswrapper[4830]: I1203 23:13:43.714187 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_85bd20b1-76d6-4238-be14-1c5891d6bbd8/galera/0.log" Dec 03 23:13:43 crc kubenswrapper[4830]: I1203 23:13:43.720913 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_85bd20b1-76d6-4238-be14-1c5891d6bbd8/mysql-bootstrap/0.log" Dec 03 23:13:43 crc kubenswrapper[4830]: I1203 23:13:43.824810 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_670e6335-d34c-46e4-8b4d-89dbd65c35a7/openstackclient/0.log" Dec 03 23:13:43 crc kubenswrapper[4830]: I1203 23:13:43.971109 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5r2qg_3af15ba1-ae94-416f-afe7-534d88ee8a64/openstack-network-exporter/0.log" Dec 03 23:13:44 crc kubenswrapper[4830]: I1203 23:13:44.118938 4830 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-controller-nlxm7_6bdf507c-05be-4df8-8c33-85f15c05237c/ovn-controller/0.log" Dec 03 23:13:44 crc kubenswrapper[4830]: I1203 23:13:44.250251 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8j27r_b0d02b19-e65a-4c45-b658-a34c69cdf74e/ovsdb-server-init/0.log" Dec 03 23:13:44 crc kubenswrapper[4830]: I1203 23:13:44.490244 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8j27r_b0d02b19-e65a-4c45-b658-a34c69cdf74e/ovs-vswitchd/0.log" Dec 03 23:13:44 crc kubenswrapper[4830]: I1203 23:13:44.506756 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8j27r_b0d02b19-e65a-4c45-b658-a34c69cdf74e/ovsdb-server-init/0.log" Dec 03 23:13:44 crc kubenswrapper[4830]: I1203 23:13:44.535331 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8j27r_b0d02b19-e65a-4c45-b658-a34c69cdf74e/ovsdb-server/0.log" Dec 03 23:13:44 crc kubenswrapper[4830]: I1203 23:13:44.743451 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_516cc148-c477-46f0-bc3e-475ad6003486/openstack-network-exporter/0.log" Dec 03 23:13:44 crc kubenswrapper[4830]: I1203 23:13:44.744041 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-2wbdq_f036d299-7239-400c-b3c4-f20ec8ed1f26/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:44 crc kubenswrapper[4830]: I1203 23:13:44.821640 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_516cc148-c477-46f0-bc3e-475ad6003486/ovn-northd/0.log" Dec 03 23:13:45 crc kubenswrapper[4830]: I1203 23:13:45.027440 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_63965380-d86f-4abf-9c9c-4d5a25ad6754/ovsdbserver-nb/0.log" Dec 03 23:13:45 crc kubenswrapper[4830]: I1203 23:13:45.063971 4830 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_63965380-d86f-4abf-9c9c-4d5a25ad6754/openstack-network-exporter/0.log" Dec 03 23:13:45 crc kubenswrapper[4830]: I1203 23:13:45.252353 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c212e9c4-4562-48b2-9be8-bf00f52a076a/ovsdbserver-sb/0.log" Dec 03 23:13:45 crc kubenswrapper[4830]: I1203 23:13:45.316874 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c212e9c4-4562-48b2-9be8-bf00f52a076a/openstack-network-exporter/0.log" Dec 03 23:13:45 crc kubenswrapper[4830]: I1203 23:13:45.531521 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c66b45784-w46xw_7956cb25-0b77-4822-b26e-dd512559f30b/placement-api/0.log" Dec 03 23:13:45 crc kubenswrapper[4830]: I1203 23:13:45.551038 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c66b45784-w46xw_7956cb25-0b77-4822-b26e-dd512559f30b/placement-log/0.log" Dec 03 23:13:45 crc kubenswrapper[4830]: I1203 23:13:45.569682 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_86b4b71c-3ed3-4413-a90b-4523b4a5c549/init-config-reloader/0.log" Dec 03 23:13:45 crc kubenswrapper[4830]: I1203 23:13:45.777056 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_86b4b71c-3ed3-4413-a90b-4523b4a5c549/init-config-reloader/0.log" Dec 03 23:13:45 crc kubenswrapper[4830]: I1203 23:13:45.804647 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_86b4b71c-3ed3-4413-a90b-4523b4a5c549/prometheus/0.log" Dec 03 23:13:45 crc kubenswrapper[4830]: I1203 23:13:45.856627 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_86b4b71c-3ed3-4413-a90b-4523b4a5c549/thanos-sidecar/0.log" Dec 03 23:13:45 crc kubenswrapper[4830]: I1203 23:13:45.891484 4830 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_86b4b71c-3ed3-4413-a90b-4523b4a5c549/config-reloader/0.log" Dec 03 23:13:46 crc kubenswrapper[4830]: I1203 23:13:46.053828 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6fb3b204-2b5a-4dcb-a278-d58ea0dce557/setup-container/0.log" Dec 03 23:13:46 crc kubenswrapper[4830]: I1203 23:13:46.245727 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6fb3b204-2b5a-4dcb-a278-d58ea0dce557/setup-container/0.log" Dec 03 23:13:46 crc kubenswrapper[4830]: I1203 23:13:46.373659 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6fb3b204-2b5a-4dcb-a278-d58ea0dce557/rabbitmq/0.log" Dec 03 23:13:46 crc kubenswrapper[4830]: I1203 23:13:46.389424 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5a2c4f61-6b61-4907-8601-6eea8065d2f6/setup-container/0.log" Dec 03 23:13:46 crc kubenswrapper[4830]: I1203 23:13:46.621151 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5a2c4f61-6b61-4907-8601-6eea8065d2f6/setup-container/0.log" Dec 03 23:13:46 crc kubenswrapper[4830]: I1203 23:13:46.647676 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5a2c4f61-6b61-4907-8601-6eea8065d2f6/rabbitmq/0.log" Dec 03 23:13:46 crc kubenswrapper[4830]: I1203 23:13:46.654409 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-l5vfd_2153e26c-6b48-4353-9ee3-ac526f0d76b2/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:46 crc kubenswrapper[4830]: I1203 23:13:46.865838 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-6slmb_18583450-269e-412a-99f5-203326569e83/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:46 crc kubenswrapper[4830]: I1203 23:13:46.963902 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-n6b86_c3a26e72-8b75-423c-a151-d576eb7a4128/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:47 crc kubenswrapper[4830]: I1203 23:13:47.213028 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zhhk9_922398e7-7669-44c9-89e4-cf9cfea422c8/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:47 crc kubenswrapper[4830]: I1203 23:13:47.281868 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-wlhvr_79dc5285-4e46-46d1-a708-a1f1623b7448/ssh-known-hosts-edpm-deployment/0.log" Dec 03 23:13:47 crc kubenswrapper[4830]: I1203 23:13:47.545679 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85b449dfbc-dfzlc_614c1318-6703-47ae-89e5-e9b2dd9758e3/proxy-server/0.log" Dec 03 23:13:47 crc kubenswrapper[4830]: I1203 23:13:47.633770 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85b449dfbc-dfzlc_614c1318-6703-47ae-89e5-e9b2dd9758e3/proxy-httpd/0.log" Dec 03 23:13:47 crc kubenswrapper[4830]: I1203 23:13:47.707863 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-wvw9q_19889054-44cb-47a4-a604-a319f1bd25af/swift-ring-rebalance/0.log" Dec 03 23:13:47 crc kubenswrapper[4830]: I1203 23:13:47.788426 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/account-auditor/0.log" Dec 03 23:13:47 crc kubenswrapper[4830]: I1203 23:13:47.849761 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/account-reaper/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.007948 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/account-server/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.014616 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/account-replicator/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.043142 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/container-auditor/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.056674 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/container-replicator/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.244615 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/container-server/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.263231 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/object-auditor/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.293385 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/container-updater/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.296998 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/object-expirer/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.336630 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" 
Dec 03 23:13:48 crc kubenswrapper[4830]: E1203 23:13:48.336995 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.465589 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/object-updater/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.501153 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/object-replicator/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.514125 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/object-server/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.567071 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/rsync/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.757955 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eb07d691-fbbb-46d8-bc0a-3eb0f35bf64e/swift-recon-cron/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.775282 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-pf9wl_3cbad56b-f578-4ad4-bdb4-13c72261814d/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.967065 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_a8c3b6fd-772d-4354-9076-a56d78d4ad0a/tempest-tests-tempest-tests-runner/0.log" Dec 03 23:13:48 crc kubenswrapper[4830]: I1203 23:13:48.978842 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ec873385-77d4-4549-8670-b2f507bd999b/test-operator-logs-container/0.log" Dec 03 23:13:49 crc kubenswrapper[4830]: I1203 23:13:49.204474 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kc2rw_14a22d3c-4a83-41b5-b9e7-7862b7b62ab5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 23:13:53 crc kubenswrapper[4830]: I1203 23:13:53.866893 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_deb3672e-3fb5-4549-ae27-6f7402c1e3d8/memcached/0.log" Dec 03 23:14:03 crc kubenswrapper[4830]: I1203 23:14:03.338308 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:14:04 crc kubenswrapper[4830]: I1203 23:14:04.429911 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"94831fe442d079e7d7e75327e587f8e56c3f30401cd93e0c187206047c524a0a"} Dec 03 23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.076684 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qnp24_cca79891-68e7-4827-8da4-c0570dbca762/kube-rbac-proxy/0.log" Dec 03 23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.197207 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qnp24_cca79891-68e7-4827-8da4-c0570dbca762/manager/0.log" Dec 03 23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.220148 4830 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-rhhdr_0ecd210b-fb48-42fe-b161-6583d913b6f8/kube-rbac-proxy/0.log" Dec 03 23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.326022 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-rhhdr_0ecd210b-fb48-42fe-b161-6583d913b6f8/manager/0.log" Dec 03 23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.440594 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/util/0.log" Dec 03 23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.599841 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/util/0.log" Dec 03 23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.627650 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/pull/0.log" Dec 03 23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.669374 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/pull/0.log" Dec 03 23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.819218 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/util/0.log" Dec 03 23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.822313 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/pull/0.log" Dec 03 
23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.839585 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d2b89cb9716f44b9c475cfb7e71244f393fe86fe2707a7d4113f58c65as42c9_53e3bc06-3d65-43f4-a54f-638f4871c97f/extract/0.log" Dec 03 23:14:17 crc kubenswrapper[4830]: I1203 23:14:17.994857 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-zgtdz_22f7d8a7-b9bf-40ca-aca3-13a370558f38/manager/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.004260 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-zgtdz_22f7d8a7-b9bf-40ca-aca3-13a370558f38/kube-rbac-proxy/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.056241 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-qcz7k_e8bc8bcd-fde2-43fd-86ae-814182f2f5ac/kube-rbac-proxy/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.252640 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-nggxg_5e5620ec-6ef3-47fc-b88b-06a2f2849b48/manager/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.289958 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-nggxg_5e5620ec-6ef3-47fc-b88b-06a2f2849b48/kube-rbac-proxy/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.311402 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-qcz7k_e8bc8bcd-fde2-43fd-86ae-814182f2f5ac/manager/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.450435 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jj85k_a5f4a0b7-d118-45b5-ab87-9f03413d4671/kube-rbac-proxy/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.480183 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-jj85k_a5f4a0b7-d118-45b5-ab87-9f03413d4671/manager/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.629888 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-pd68x_4cf3851e-6624-48c2-aa71-e799e6b6b685/kube-rbac-proxy/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.765426 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-5t4bj_b0dc8ce5-ac38-4bac-8026-5ca446e16340/kube-rbac-proxy/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.816332 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-5t4bj_b0dc8ce5-ac38-4bac-8026-5ca446e16340/manager/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.845949 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-pd68x_4cf3851e-6624-48c2-aa71-e799e6b6b685/manager/0.log" Dec 03 23:14:18 crc kubenswrapper[4830]: I1203 23:14:18.963244 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4fshq_e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae/kube-rbac-proxy/0.log" Dec 03 23:14:19 crc kubenswrapper[4830]: I1203 23:14:19.052501 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4fshq_e9090eeb-2bd9-4c1d-b3f2-eadc5f6d53ae/manager/0.log" Dec 03 23:14:19 crc kubenswrapper[4830]: I1203 23:14:19.138294 
4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-8wlrd_01175cc5-e6fa-4e26-b76b-6b7e2a71d51a/manager/0.log" Dec 03 23:14:19 crc kubenswrapper[4830]: I1203 23:14:19.147958 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-8wlrd_01175cc5-e6fa-4e26-b76b-6b7e2a71d51a/kube-rbac-proxy/0.log" Dec 03 23:14:19 crc kubenswrapper[4830]: I1203 23:14:19.268097 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-48cc9_325f811a-891b-48ae-bde4-a72e7580c925/kube-rbac-proxy/0.log" Dec 03 23:14:19 crc kubenswrapper[4830]: I1203 23:14:19.350829 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-48cc9_325f811a-891b-48ae-bde4-a72e7580c925/manager/0.log" Dec 03 23:14:19 crc kubenswrapper[4830]: I1203 23:14:19.468947 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-v2gwm_90ea4083-18d1-4ace-bcc6-81489c41f117/kube-rbac-proxy/0.log" Dec 03 23:14:19 crc kubenswrapper[4830]: I1203 23:14:19.585452 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-v2gwm_90ea4083-18d1-4ace-bcc6-81489c41f117/manager/0.log" Dec 03 23:14:19 crc kubenswrapper[4830]: I1203 23:14:19.603560 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-526ng_28b1972b-42aa-4470-8be6-240b219e5975/kube-rbac-proxy/0.log" Dec 03 23:14:19 crc kubenswrapper[4830]: I1203 23:14:19.789268 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-526ng_28b1972b-42aa-4470-8be6-240b219e5975/manager/0.log" Dec 03 23:14:19 crc 
kubenswrapper[4830]: I1203 23:14:19.819796 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-rmd7h_d18e5b1e-653d-4c0e-928f-a2d60b2af855/kube-rbac-proxy/0.log" Dec 03 23:14:19 crc kubenswrapper[4830]: I1203 23:14:19.855681 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-rmd7h_d18e5b1e-653d-4c0e-928f-a2d60b2af855/manager/0.log" Dec 03 23:14:19 crc kubenswrapper[4830]: I1203 23:14:19.988752 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h_7d1112c1-ffce-45e1-94a4-3aad2ae50fe5/kube-rbac-proxy/0.log" Dec 03 23:14:20 crc kubenswrapper[4830]: I1203 23:14:20.051888 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4zz78h_7d1112c1-ffce-45e1-94a4-3aad2ae50fe5/manager/0.log" Dec 03 23:14:20 crc kubenswrapper[4830]: I1203 23:14:20.451443 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8545cc8fb-sm7mq_dd670738-8808-4f9b-8fea-98c4ab57fb06/operator/0.log" Dec 03 23:14:20 crc kubenswrapper[4830]: I1203 23:14:20.553400 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-pm2p5_e3d5735a-003c-4c35-9239-c80e7e6dbafc/registry-server/0.log" Dec 03 23:14:20 crc kubenswrapper[4830]: I1203 23:14:20.758402 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-vwd82_908c7892-9ff8-4d17-86ea-2daf891ea90b/kube-rbac-proxy/0.log" Dec 03 23:14:20 crc kubenswrapper[4830]: I1203 23:14:20.808427 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-vwd82_908c7892-9ff8-4d17-86ea-2daf891ea90b/manager/0.log" Dec 03 23:14:20 crc kubenswrapper[4830]: I1203 23:14:20.950046 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-rs6wd_55d8936f-55fc-4a92-b8a2-c393b6b46eeb/kube-rbac-proxy/0.log" Dec 03 23:14:20 crc kubenswrapper[4830]: I1203 23:14:20.990614 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-rs6wd_55d8936f-55fc-4a92-b8a2-c393b6b46eeb/manager/0.log" Dec 03 23:14:21 crc kubenswrapper[4830]: I1203 23:14:21.397328 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bb8cf96cb-6vrpp_0c670280-553c-4251-ac28-04fdd66313a7/manager/0.log" Dec 03 23:14:21 crc kubenswrapper[4830]: I1203 23:14:21.853844 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-h5tll_27ed445d-9111-479a-8dc5-5808e0af45be/operator/0.log" Dec 03 23:14:21 crc kubenswrapper[4830]: I1203 23:14:21.914848 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-sqmr8_84a09470-19ba-4bef-b0de-1fa4df1561ae/manager/0.log" Dec 03 23:14:21 crc kubenswrapper[4830]: I1203 23:14:21.921971 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-sqmr8_84a09470-19ba-4bef-b0de-1fa4df1561ae/kube-rbac-proxy/0.log" Dec 03 23:14:22 crc kubenswrapper[4830]: I1203 23:14:22.063231 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-59779d887b-2cqbq_c0f0376d-c348-4b7b-b4e1-f8717ea05299/kube-rbac-proxy/0.log" Dec 03 23:14:22 crc kubenswrapper[4830]: I1203 23:14:22.139605 4830 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-7bhdq_d5a49c34-e03d-49b5-a5a8-507af8ce99be/kube-rbac-proxy/0.log" Dec 03 23:14:22 crc kubenswrapper[4830]: I1203 23:14:22.321135 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-7bhdq_d5a49c34-e03d-49b5-a5a8-507af8ce99be/manager/0.log" Dec 03 23:14:22 crc kubenswrapper[4830]: I1203 23:14:22.443462 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-dtdgr_dec06c39-2a96-4fc6-a2e2-ad865fc394d9/kube-rbac-proxy/0.log" Dec 03 23:14:22 crc kubenswrapper[4830]: I1203 23:14:22.565113 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-dtdgr_dec06c39-2a96-4fc6-a2e2-ad865fc394d9/manager/0.log" Dec 03 23:14:22 crc kubenswrapper[4830]: I1203 23:14:22.573546 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-59779d887b-2cqbq_c0f0376d-c348-4b7b-b4e1-f8717ea05299/manager/0.log" Dec 03 23:14:44 crc kubenswrapper[4830]: I1203 23:14:44.780209 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-c2sm5_c8f89cbd-cef8-468a-973d-6e513dcb4e09/control-plane-machine-set-operator/0.log" Dec 03 23:14:44 crc kubenswrapper[4830]: I1203 23:14:44.971870 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c8f8r_b9ff0d92-ab2f-4815-9659-7b4507d64344/kube-rbac-proxy/0.log" Dec 03 23:14:45 crc kubenswrapper[4830]: I1203 23:14:45.011970 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c8f8r_b9ff0d92-ab2f-4815-9659-7b4507d64344/machine-api-operator/0.log" Dec 03 23:14:58 
crc kubenswrapper[4830]: I1203 23:14:58.916484 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-zbjs6_361b0a92-2587-4f96-a941-f9c50fd46e10/cert-manager-controller/0.log" Dec 03 23:14:59 crc kubenswrapper[4830]: I1203 23:14:59.028992 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-5c749_d84ca934-7bba-4889-b4a6-feec21575832/cert-manager-cainjector/0.log" Dec 03 23:14:59 crc kubenswrapper[4830]: I1203 23:14:59.125896 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-sr4sh_812edfa4-0a35-4dce-b14c-3addb5812eb7/cert-manager-webhook/0.log" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.181311 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc"] Dec 03 23:15:00 crc kubenswrapper[4830]: E1203 23:15:00.182139 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87df9c3-5372-4399-877f-b132ae27b408" containerName="registry-server" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.182159 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87df9c3-5372-4399-877f-b132ae27b408" containerName="registry-server" Dec 03 23:15:00 crc kubenswrapper[4830]: E1203 23:15:00.182200 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87df9c3-5372-4399-877f-b132ae27b408" containerName="extract-utilities" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.182209 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87df9c3-5372-4399-877f-b132ae27b408" containerName="extract-utilities" Dec 03 23:15:00 crc kubenswrapper[4830]: E1203 23:15:00.182233 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87df9c3-5372-4399-877f-b132ae27b408" containerName="extract-content" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.182243 4830 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a87df9c3-5372-4399-877f-b132ae27b408" containerName="extract-content" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.182537 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87df9c3-5372-4399-877f-b132ae27b408" containerName="registry-server" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.183500 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.185628 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.185837 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.213637 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc"] Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.379081 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-config-volume\") pod \"collect-profiles-29413395-8mnkc\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.379159 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjgdw\" (UniqueName: \"kubernetes.io/projected/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-kube-api-access-bjgdw\") pod \"collect-profiles-29413395-8mnkc\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.379207 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-secret-volume\") pod \"collect-profiles-29413395-8mnkc\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.482681 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-config-volume\") pod \"collect-profiles-29413395-8mnkc\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.482743 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjgdw\" (UniqueName: \"kubernetes.io/projected/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-kube-api-access-bjgdw\") pod \"collect-profiles-29413395-8mnkc\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.482781 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-secret-volume\") pod \"collect-profiles-29413395-8mnkc\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.483562 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-config-volume\") pod \"collect-profiles-29413395-8mnkc\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.488157 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-secret-volume\") pod \"collect-profiles-29413395-8mnkc\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.498377 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjgdw\" (UniqueName: \"kubernetes.io/projected/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-kube-api-access-bjgdw\") pod \"collect-profiles-29413395-8mnkc\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:00 crc kubenswrapper[4830]: I1203 23:15:00.511789 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:01 crc kubenswrapper[4830]: I1203 23:15:01.079086 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc"] Dec 03 23:15:01 crc kubenswrapper[4830]: W1203 23:15:01.090763 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac73698d_72e8_4b2b_8aa5_bcdf57ed0146.slice/crio-9cd427a500e92a34c3e10fdd1a1fb7a014ffb3a457fdb5dc45f0c70780b6f290 WatchSource:0}: Error finding container 9cd427a500e92a34c3e10fdd1a1fb7a014ffb3a457fdb5dc45f0c70780b6f290: Status 404 returned error can't find the container with id 9cd427a500e92a34c3e10fdd1a1fb7a014ffb3a457fdb5dc45f0c70780b6f290 Dec 03 23:15:01 crc kubenswrapper[4830]: I1203 23:15:01.973014 4830 generic.go:334] "Generic (PLEG): container finished" podID="ac73698d-72e8-4b2b-8aa5-bcdf57ed0146" containerID="6dab83e19d797b42d47e37e874f9cf0c3d19f8e2bb65be33150ada59d158f898" exitCode=0 Dec 03 23:15:01 crc kubenswrapper[4830]: I1203 23:15:01.973120 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" event={"ID":"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146","Type":"ContainerDied","Data":"6dab83e19d797b42d47e37e874f9cf0c3d19f8e2bb65be33150ada59d158f898"} Dec 03 23:15:01 crc kubenswrapper[4830]: I1203 23:15:01.973299 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" event={"ID":"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146","Type":"ContainerStarted","Data":"9cd427a500e92a34c3e10fdd1a1fb7a014ffb3a457fdb5dc45f0c70780b6f290"} Dec 03 23:15:04 crc kubenswrapper[4830]: I1203 23:15:04.145097 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:04 crc kubenswrapper[4830]: I1203 23:15:04.259074 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjgdw\" (UniqueName: \"kubernetes.io/projected/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-kube-api-access-bjgdw\") pod \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " Dec 03 23:15:04 crc kubenswrapper[4830]: I1203 23:15:04.259136 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-secret-volume\") pod \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " Dec 03 23:15:04 crc kubenswrapper[4830]: I1203 23:15:04.259195 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-config-volume\") pod \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\" (UID: \"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146\") " Dec 03 23:15:04 crc kubenswrapper[4830]: I1203 23:15:04.260090 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-config-volume" (OuterVolumeSpecName: "config-volume") pod "ac73698d-72e8-4b2b-8aa5-bcdf57ed0146" (UID: "ac73698d-72e8-4b2b-8aa5-bcdf57ed0146"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:15:04 crc kubenswrapper[4830]: I1203 23:15:04.265707 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ac73698d-72e8-4b2b-8aa5-bcdf57ed0146" (UID: "ac73698d-72e8-4b2b-8aa5-bcdf57ed0146"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:15:04 crc kubenswrapper[4830]: I1203 23:15:04.265715 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-kube-api-access-bjgdw" (OuterVolumeSpecName: "kube-api-access-bjgdw") pod "ac73698d-72e8-4b2b-8aa5-bcdf57ed0146" (UID: "ac73698d-72e8-4b2b-8aa5-bcdf57ed0146"). InnerVolumeSpecName "kube-api-access-bjgdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:15:04 crc kubenswrapper[4830]: I1203 23:15:04.361874 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjgdw\" (UniqueName: \"kubernetes.io/projected/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-kube-api-access-bjgdw\") on node \"crc\" DevicePath \"\"" Dec 03 23:15:04 crc kubenswrapper[4830]: I1203 23:15:04.361919 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 23:15:04 crc kubenswrapper[4830]: I1203 23:15:04.361932 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac73698d-72e8-4b2b-8aa5-bcdf57ed0146-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 23:15:05 crc kubenswrapper[4830]: I1203 23:15:05.000916 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" event={"ID":"ac73698d-72e8-4b2b-8aa5-bcdf57ed0146","Type":"ContainerDied","Data":"9cd427a500e92a34c3e10fdd1a1fb7a014ffb3a457fdb5dc45f0c70780b6f290"} Dec 03 23:15:05 crc kubenswrapper[4830]: I1203 23:15:05.001180 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cd427a500e92a34c3e10fdd1a1fb7a014ffb3a457fdb5dc45f0c70780b6f290" Dec 03 23:15:05 crc kubenswrapper[4830]: I1203 23:15:05.001005 4830 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8mnkc" Dec 03 23:15:05 crc kubenswrapper[4830]: I1203 23:15:05.219139 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4"] Dec 03 23:15:05 crc kubenswrapper[4830]: I1203 23:15:05.229858 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413350-lfdq4"] Dec 03 23:15:05 crc kubenswrapper[4830]: I1203 23:15:05.353007 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b0d2fb-1190-46dc-ad5f-f6435077bf11" path="/var/lib/kubelet/pods/f7b0d2fb-1190-46dc-ad5f-f6435077bf11/volumes" Dec 03 23:15:14 crc kubenswrapper[4830]: I1203 23:15:14.474030 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-8dnq8_603c63d8-e5d2-428b-925d-aab17f1889dc/nmstate-console-plugin/0.log" Dec 03 23:15:14 crc kubenswrapper[4830]: I1203 23:15:14.637784 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-s66zt_47b33d94-20d4-4640-acb2-c25aa2903bd1/nmstate-handler/0.log" Dec 03 23:15:14 crc kubenswrapper[4830]: I1203 23:15:14.728972 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-krngn_22fd491b-eed1-4558-86bc-ae1f601fcdd0/nmstate-metrics/0.log" Dec 03 23:15:14 crc kubenswrapper[4830]: I1203 23:15:14.735414 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-krngn_22fd491b-eed1-4558-86bc-ae1f601fcdd0/kube-rbac-proxy/0.log" Dec 03 23:15:14 crc kubenswrapper[4830]: I1203 23:15:14.940447 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-pncxz_de8f51ca-5084-4af1-87dc-715b869006d0/nmstate-operator/0.log" Dec 03 23:15:15 crc kubenswrapper[4830]: I1203 
23:15:15.006822 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-8pqtc_a8b2e328-2397-45e1-a593-5f5094799015/nmstate-webhook/0.log" Dec 03 23:15:28 crc kubenswrapper[4830]: I1203 23:15:28.069232 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7dcf7ddf84-vj9kj_aac5b4ce-2952-49d3-81bc-4ace758e5367/kube-rbac-proxy/0.log" Dec 03 23:15:28 crc kubenswrapper[4830]: I1203 23:15:28.098375 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7dcf7ddf84-vj9kj_aac5b4ce-2952-49d3-81bc-4ace758e5367/manager/0.log" Dec 03 23:15:30 crc kubenswrapper[4830]: I1203 23:15:30.256408 4830 scope.go:117] "RemoveContainer" containerID="759271d4429d557e2e0f563bf3248ff047c19d3c74cec15f53039fef9088820e" Dec 03 23:15:43 crc kubenswrapper[4830]: I1203 23:15:43.517161 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-cc4b2_36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d/kube-rbac-proxy/0.log" Dec 03 23:15:43 crc kubenswrapper[4830]: I1203 23:15:43.715524 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-cc4b2_36dfa7fe-861b-4f3f-8f4f-24a734b2ad9d/controller/0.log" Dec 03 23:15:43 crc kubenswrapper[4830]: I1203 23:15:43.799272 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-frr-files/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.044487 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-frr-files/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.076178 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-metrics/0.log" Dec 03 23:15:44 crc 
kubenswrapper[4830]: I1203 23:15:44.078745 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-reloader/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.079786 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-reloader/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.235324 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-frr-files/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.279298 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-metrics/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.310102 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-metrics/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.357384 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-reloader/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.550459 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-metrics/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.562143 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-reloader/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.564398 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/cp-frr-files/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.638854 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/controller/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.759720 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/kube-rbac-proxy/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.806356 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/frr-metrics/0.log" Dec 03 23:15:44 crc kubenswrapper[4830]: I1203 23:15:44.880262 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/kube-rbac-proxy-frr/0.log" Dec 03 23:15:45 crc kubenswrapper[4830]: I1203 23:15:45.023459 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/reloader/0.log" Dec 03 23:15:45 crc kubenswrapper[4830]: I1203 23:15:45.121488 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-dj5pn_ea33811b-aa63-48d4-b9f3-9446ca919e3c/frr-k8s-webhook-server/0.log" Dec 03 23:15:45 crc kubenswrapper[4830]: I1203 23:15:45.288272 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6546545fd8-lr9kg_3898c0c7-fe9f-4446-803f-01d5c019b406/manager/0.log" Dec 03 23:15:45 crc kubenswrapper[4830]: I1203 23:15:45.473213 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79d7c9f765-w2ltt_a308b4ff-4a2a-4ab4-9171-9dc572b71c29/webhook-server/0.log" Dec 03 23:15:45 crc kubenswrapper[4830]: I1203 23:15:45.690284 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4q9rp_194dbe3b-afec-4576-b9ac-8810ad9c9482/kube-rbac-proxy/0.log" Dec 03 23:15:46 crc kubenswrapper[4830]: I1203 23:15:46.278599 4830 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4q9rp_194dbe3b-afec-4576-b9ac-8810ad9c9482/speaker/0.log" Dec 03 23:15:46 crc kubenswrapper[4830]: I1203 23:15:46.334211 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-467ht_947e6799-183e-4ab9-8dac-bc02f1232c6e/frr/0.log" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.384320 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-px5gs"] Dec 03 23:15:52 crc kubenswrapper[4830]: E1203 23:15:52.385371 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac73698d-72e8-4b2b-8aa5-bcdf57ed0146" containerName="collect-profiles" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.385389 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac73698d-72e8-4b2b-8aa5-bcdf57ed0146" containerName="collect-profiles" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.385684 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac73698d-72e8-4b2b-8aa5-bcdf57ed0146" containerName="collect-profiles" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.387705 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.398864 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-px5gs"] Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.513614 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-utilities\") pod \"redhat-operators-px5gs\" (UID: \"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.513873 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-catalog-content\") pod \"redhat-operators-px5gs\" (UID: \"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.514464 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb586\" (UniqueName: \"kubernetes.io/projected/510bea8f-d0af-4eeb-bb35-03b994c50e02-kube-api-access-kb586\") pod \"redhat-operators-px5gs\" (UID: \"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.617651 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-utilities\") pod \"redhat-operators-px5gs\" (UID: \"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.617792 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-catalog-content\") pod \"redhat-operators-px5gs\" (UID: \"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.618175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-utilities\") pod \"redhat-operators-px5gs\" (UID: \"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.618210 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb586\" (UniqueName: \"kubernetes.io/projected/510bea8f-d0af-4eeb-bb35-03b994c50e02-kube-api-access-kb586\") pod \"redhat-operators-px5gs\" (UID: \"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.618532 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-catalog-content\") pod \"redhat-operators-px5gs\" (UID: \"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.642876 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb586\" (UniqueName: \"kubernetes.io/projected/510bea8f-d0af-4eeb-bb35-03b994c50e02-kube-api-access-kb586\") pod \"redhat-operators-px5gs\" (UID: \"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:15:52 crc kubenswrapper[4830]: I1203 23:15:52.718616 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:15:53 crc kubenswrapper[4830]: I1203 23:15:53.198378 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-px5gs"] Dec 03 23:15:53 crc kubenswrapper[4830]: I1203 23:15:53.464476 4830 generic.go:334] "Generic (PLEG): container finished" podID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerID="4c58ca30ef3b8fc682d8d63169a97fc8f2c0c1122deeb2d2e74b888fb968088f" exitCode=0 Dec 03 23:15:53 crc kubenswrapper[4830]: I1203 23:15:53.464724 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px5gs" event={"ID":"510bea8f-d0af-4eeb-bb35-03b994c50e02","Type":"ContainerDied","Data":"4c58ca30ef3b8fc682d8d63169a97fc8f2c0c1122deeb2d2e74b888fb968088f"} Dec 03 23:15:53 crc kubenswrapper[4830]: I1203 23:15:53.464954 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px5gs" event={"ID":"510bea8f-d0af-4eeb-bb35-03b994c50e02","Type":"ContainerStarted","Data":"5279f2d0a55d86e88be0f89933e36cfdf25d7520d7e0ac9f08267d54cd4d8c7e"} Dec 03 23:15:54 crc kubenswrapper[4830]: I1203 23:15:54.477167 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px5gs" event={"ID":"510bea8f-d0af-4eeb-bb35-03b994c50e02","Type":"ContainerStarted","Data":"f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153"} Dec 03 23:15:57 crc kubenswrapper[4830]: I1203 23:15:57.506055 4830 generic.go:334] "Generic (PLEG): container finished" podID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerID="f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153" exitCode=0 Dec 03 23:15:57 crc kubenswrapper[4830]: I1203 23:15:57.506537 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px5gs" 
event={"ID":"510bea8f-d0af-4eeb-bb35-03b994c50e02","Type":"ContainerDied","Data":"f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153"} Dec 03 23:15:58 crc kubenswrapper[4830]: I1203 23:15:58.528946 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px5gs" event={"ID":"510bea8f-d0af-4eeb-bb35-03b994c50e02","Type":"ContainerStarted","Data":"53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6"} Dec 03 23:15:58 crc kubenswrapper[4830]: I1203 23:15:58.560693 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-px5gs" podStartSLOduration=1.887684715 podStartE2EDuration="6.560650237s" podCreationTimestamp="2025-12-03 23:15:52 +0000 UTC" firstStartedPulling="2025-12-03 23:15:53.46648629 +0000 UTC m=+4242.462947639" lastFinishedPulling="2025-12-03 23:15:58.139451822 +0000 UTC m=+4247.135913161" observedRunningTime="2025-12-03 23:15:58.545445142 +0000 UTC m=+4247.541906501" watchObservedRunningTime="2025-12-03 23:15:58.560650237 +0000 UTC m=+4247.557111596" Dec 03 23:16:01 crc kubenswrapper[4830]: I1203 23:16:01.742996 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/util/0.log" Dec 03 23:16:01 crc kubenswrapper[4830]: I1203 23:16:01.925404 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/util/0.log" Dec 03 23:16:01 crc kubenswrapper[4830]: I1203 23:16:01.933588 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/pull/0.log" Dec 03 23:16:01 crc kubenswrapper[4830]: I1203 23:16:01.972005 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/pull/0.log" Dec 03 23:16:02 crc kubenswrapper[4830]: I1203 23:16:02.410451 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/util/0.log" Dec 03 23:16:02 crc kubenswrapper[4830]: I1203 23:16:02.438918 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/pull/0.log" Dec 03 23:16:02 crc kubenswrapper[4830]: I1203 23:16:02.439037 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694l9wsd_ba6a7d25-7c4b-4587-bfbe-8197f7be7eed/extract/0.log" Dec 03 23:16:02 crc kubenswrapper[4830]: I1203 23:16:02.618977 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/util/0.log" Dec 03 23:16:02 crc kubenswrapper[4830]: I1203 23:16:02.719334 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:16:02 crc kubenswrapper[4830]: I1203 23:16:02.719599 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:16:02 crc kubenswrapper[4830]: I1203 23:16:02.860582 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/util/0.log" Dec 03 23:16:02 crc kubenswrapper[4830]: I1203 23:16:02.861576 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/pull/0.log" Dec 03 23:16:02 crc kubenswrapper[4830]: I1203 23:16:02.877256 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/pull/0.log" Dec 03 23:16:03 crc kubenswrapper[4830]: I1203 23:16:03.054939 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/util/0.log" Dec 03 23:16:03 crc kubenswrapper[4830]: I1203 23:16:03.055447 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/pull/0.log" Dec 03 23:16:03 crc kubenswrapper[4830]: I1203 23:16:03.150556 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303qglx5_b3a9c045-3aac-47e1-ae39-645d33985f37/extract/0.log" Dec 03 23:16:03 crc kubenswrapper[4830]: I1203 23:16:03.316730 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/util/0.log" Dec 03 23:16:03 crc kubenswrapper[4830]: I1203 23:16:03.522458 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/pull/0.log" Dec 03 23:16:03 crc kubenswrapper[4830]: I1203 23:16:03.523146 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/pull/0.log" Dec 03 
23:16:03 crc kubenswrapper[4830]: I1203 23:16:03.575925 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/util/0.log" Dec 03 23:16:03 crc kubenswrapper[4830]: I1203 23:16:03.778202 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-px5gs" podUID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerName="registry-server" probeResult="failure" output=< Dec 03 23:16:03 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Dec 03 23:16:03 crc kubenswrapper[4830]: > Dec 03 23:16:03 crc kubenswrapper[4830]: I1203 23:16:03.881304 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/util/0.log" Dec 03 23:16:03 crc kubenswrapper[4830]: I1203 23:16:03.931384 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/pull/0.log" Dec 03 23:16:03 crc kubenswrapper[4830]: I1203 23:16:03.978911 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvc8b4_dd6b638e-c924-4e57-9e1f-ec4da2ef3db1/extract/0.log" Dec 03 23:16:04 crc kubenswrapper[4830]: I1203 23:16:04.120874 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/util/0.log" Dec 03 23:16:04 crc kubenswrapper[4830]: I1203 23:16:04.285952 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/pull/0.log" Dec 03 23:16:04 
crc kubenswrapper[4830]: I1203 23:16:04.310963 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/pull/0.log" Dec 03 23:16:04 crc kubenswrapper[4830]: I1203 23:16:04.376659 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/util/0.log" Dec 03 23:16:04 crc kubenswrapper[4830]: I1203 23:16:04.562897 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/pull/0.log" Dec 03 23:16:04 crc kubenswrapper[4830]: I1203 23:16:04.616538 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/extract/0.log" Dec 03 23:16:04 crc kubenswrapper[4830]: I1203 23:16:04.672907 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rnrht_dfc36678-47ec-4ef3-bba7-a1ddec002156/util/0.log" Dec 03 23:16:04 crc kubenswrapper[4830]: I1203 23:16:04.810328 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/util/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.003178 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/pull/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.042472 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/pull/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.057350 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/util/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.273199 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/util/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.348546 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/extract/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.381795 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-utilities/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.385697 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838qdzg_aa7fb560-5dc2-47f6-9d59-2f3f4ed47f3a/pull/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.534305 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-content/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.549222 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-utilities/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.598118 4830 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-content/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.762347 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-content/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.800248 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/extract-utilities/0.log" Dec 03 23:16:05 crc kubenswrapper[4830]: I1203 23:16:05.848657 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ls26g_1b0cb96d-c126-4872-a661-61efcba529c3/extract-utilities/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.157048 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ls26g_1b0cb96d-c126-4872-a661-61efcba529c3/extract-content/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.204676 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ls26g_1b0cb96d-c126-4872-a661-61efcba529c3/extract-utilities/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.225003 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ls26g_1b0cb96d-c126-4872-a661-61efcba529c3/extract-content/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.269656 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tf9xv_a883c6ca-b81f-4954-9606-552fb6ee7b29/registry-server/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.417102 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ls26g_1b0cb96d-c126-4872-a661-61efcba529c3/extract-content/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.424383 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ls26g_1b0cb96d-c126-4872-a661-61efcba529c3/extract-utilities/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.536145 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7s8ph_96069a0e-4ce1-4f68-835c-0a0110f36b2c/marketplace-operator/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.568945 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ls26g_1b0cb96d-c126-4872-a661-61efcba529c3/registry-server/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.735811 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-utilities/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.892395 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-utilities/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.897106 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-content/0.log" Dec 03 23:16:06 crc kubenswrapper[4830]: I1203 23:16:06.897668 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-content/0.log" Dec 03 23:16:07 crc kubenswrapper[4830]: I1203 23:16:07.147808 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-utilities/0.log" Dec 03 23:16:07 crc kubenswrapper[4830]: I1203 23:16:07.148005 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-utilities/0.log" Dec 03 23:16:07 crc kubenswrapper[4830]: I1203 23:16:07.209723 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/extract-content/0.log" Dec 03 23:16:07 crc kubenswrapper[4830]: I1203 23:16:07.263527 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m7gkr_3387e2a9-0110-4c74-adf2-87587a00adf8/registry-server/0.log" Dec 03 23:16:07 crc kubenswrapper[4830]: I1203 23:16:07.604760 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-utilities/0.log" Dec 03 23:16:07 crc kubenswrapper[4830]: I1203 23:16:07.613932 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-content/0.log" Dec 03 23:16:07 crc kubenswrapper[4830]: I1203 23:16:07.637343 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-content/0.log" Dec 03 23:16:07 crc kubenswrapper[4830]: I1203 23:16:07.851371 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-content/0.log" Dec 03 23:16:07 crc kubenswrapper[4830]: I1203 23:16:07.876635 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/extract-utilities/0.log" Dec 
03 23:16:07 crc kubenswrapper[4830]: I1203 23:16:07.969023 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-px5gs_510bea8f-d0af-4eeb-bb35-03b994c50e02/extract-utilities/0.log" Dec 03 23:16:08 crc kubenswrapper[4830]: I1203 23:16:08.147593 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-px5gs_510bea8f-d0af-4eeb-bb35-03b994c50e02/extract-utilities/0.log" Dec 03 23:16:08 crc kubenswrapper[4830]: I1203 23:16:08.213056 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-px5gs_510bea8f-d0af-4eeb-bb35-03b994c50e02/extract-content/0.log" Dec 03 23:16:08 crc kubenswrapper[4830]: I1203 23:16:08.259992 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-px5gs_510bea8f-d0af-4eeb-bb35-03b994c50e02/extract-content/0.log" Dec 03 23:16:08 crc kubenswrapper[4830]: I1203 23:16:08.426240 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-px5gs_510bea8f-d0af-4eeb-bb35-03b994c50e02/extract-utilities/0.log" Dec 03 23:16:08 crc kubenswrapper[4830]: I1203 23:16:08.511053 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mc5ls_85f85a6a-507e-4744-91ee-1e9471e607c4/registry-server/0.log" Dec 03 23:16:08 crc kubenswrapper[4830]: I1203 23:16:08.546085 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-px5gs_510bea8f-d0af-4eeb-bb35-03b994c50e02/extract-content/0.log" Dec 03 23:16:08 crc kubenswrapper[4830]: I1203 23:16:08.552556 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-px5gs_510bea8f-d0af-4eeb-bb35-03b994c50e02/registry-server/0.log" Dec 03 23:16:12 crc kubenswrapper[4830]: I1203 23:16:12.773156 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:16:12 crc kubenswrapper[4830]: I1203 23:16:12.822218 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:16:13 crc kubenswrapper[4830]: I1203 23:16:13.009433 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-px5gs"] Dec 03 23:16:14 crc kubenswrapper[4830]: I1203 23:16:14.671740 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-px5gs" podUID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerName="registry-server" containerID="cri-o://53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6" gracePeriod=2 Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.285124 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.329110 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-utilities\") pod \"510bea8f-d0af-4eeb-bb35-03b994c50e02\" (UID: \"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.329630 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb586\" (UniqueName: \"kubernetes.io/projected/510bea8f-d0af-4eeb-bb35-03b994c50e02-kube-api-access-kb586\") pod \"510bea8f-d0af-4eeb-bb35-03b994c50e02\" (UID: \"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.329780 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-catalog-content\") pod \"510bea8f-d0af-4eeb-bb35-03b994c50e02\" (UID: 
\"510bea8f-d0af-4eeb-bb35-03b994c50e02\") " Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.330115 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-utilities" (OuterVolumeSpecName: "utilities") pod "510bea8f-d0af-4eeb-bb35-03b994c50e02" (UID: "510bea8f-d0af-4eeb-bb35-03b994c50e02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.330433 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.335961 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/510bea8f-d0af-4eeb-bb35-03b994c50e02-kube-api-access-kb586" (OuterVolumeSpecName: "kube-api-access-kb586") pod "510bea8f-d0af-4eeb-bb35-03b994c50e02" (UID: "510bea8f-d0af-4eeb-bb35-03b994c50e02"). InnerVolumeSpecName "kube-api-access-kb586". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.432682 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb586\" (UniqueName: \"kubernetes.io/projected/510bea8f-d0af-4eeb-bb35-03b994c50e02-kube-api-access-kb586\") on node \"crc\" DevicePath \"\"" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.459036 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "510bea8f-d0af-4eeb-bb35-03b994c50e02" (UID: "510bea8f-d0af-4eeb-bb35-03b994c50e02"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.534738 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/510bea8f-d0af-4eeb-bb35-03b994c50e02-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.685808 4830 generic.go:334] "Generic (PLEG): container finished" podID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerID="53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6" exitCode=0 Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.685855 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px5gs" event={"ID":"510bea8f-d0af-4eeb-bb35-03b994c50e02","Type":"ContainerDied","Data":"53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6"} Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.685877 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-px5gs" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.685905 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-px5gs" event={"ID":"510bea8f-d0af-4eeb-bb35-03b994c50e02","Type":"ContainerDied","Data":"5279f2d0a55d86e88be0f89933e36cfdf25d7520d7e0ac9f08267d54cd4d8c7e"} Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.685926 4830 scope.go:117] "RemoveContainer" containerID="53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.703553 4830 scope.go:117] "RemoveContainer" containerID="f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.728200 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-px5gs"] Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.735658 4830 scope.go:117] "RemoveContainer" containerID="4c58ca30ef3b8fc682d8d63169a97fc8f2c0c1122deeb2d2e74b888fb968088f" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.740075 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-px5gs"] Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.780300 4830 scope.go:117] "RemoveContainer" containerID="53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6" Dec 03 23:16:15 crc kubenswrapper[4830]: E1203 23:16:15.780709 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6\": container with ID starting with 53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6 not found: ID does not exist" containerID="53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.780752 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6"} err="failed to get container status \"53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6\": rpc error: code = NotFound desc = could not find container \"53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6\": container with ID starting with 53cfe1164529f8340a9e4a60d8bd07c731760a3714a1ad0790add72ff01747d6 not found: ID does not exist" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.780781 4830 scope.go:117] "RemoveContainer" containerID="f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153" Dec 03 23:16:15 crc kubenswrapper[4830]: E1203 23:16:15.781028 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153\": container with ID starting with f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153 not found: ID does not exist" containerID="f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.781056 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153"} err="failed to get container status \"f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153\": rpc error: code = NotFound desc = could not find container \"f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153\": container with ID starting with f6d195e3782c1604b7ec1d551a6e6bbb1431488c352d38799bee59be48c9d153 not found: ID does not exist" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.781075 4830 scope.go:117] "RemoveContainer" containerID="4c58ca30ef3b8fc682d8d63169a97fc8f2c0c1122deeb2d2e74b888fb968088f" Dec 03 23:16:15 crc kubenswrapper[4830]: E1203 
23:16:15.781342 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c58ca30ef3b8fc682d8d63169a97fc8f2c0c1122deeb2d2e74b888fb968088f\": container with ID starting with 4c58ca30ef3b8fc682d8d63169a97fc8f2c0c1122deeb2d2e74b888fb968088f not found: ID does not exist" containerID="4c58ca30ef3b8fc682d8d63169a97fc8f2c0c1122deeb2d2e74b888fb968088f" Dec 03 23:16:15 crc kubenswrapper[4830]: I1203 23:16:15.781372 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c58ca30ef3b8fc682d8d63169a97fc8f2c0c1122deeb2d2e74b888fb968088f"} err="failed to get container status \"4c58ca30ef3b8fc682d8d63169a97fc8f2c0c1122deeb2d2e74b888fb968088f\": rpc error: code = NotFound desc = could not find container \"4c58ca30ef3b8fc682d8d63169a97fc8f2c0c1122deeb2d2e74b888fb968088f\": container with ID starting with 4c58ca30ef3b8fc682d8d63169a97fc8f2c0c1122deeb2d2e74b888fb968088f not found: ID does not exist" Dec 03 23:16:17 crc kubenswrapper[4830]: I1203 23:16:17.349713 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="510bea8f-d0af-4eeb-bb35-03b994c50e02" path="/var/lib/kubelet/pods/510bea8f-d0af-4eeb-bb35-03b994c50e02/volumes" Dec 03 23:16:21 crc kubenswrapper[4830]: I1203 23:16:21.513074 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-qzckh_eb61183f-1e00-4056-9cf6-d1503c208d29/prometheus-operator/0.log" Dec 03 23:16:21 crc kubenswrapper[4830]: I1203 23:16:21.655137 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-drjdl_17fb9e38-36ef-4709-8d72-71e4ca6fa8ad/prometheus-operator-admission-webhook/0.log" Dec 03 23:16:21 crc kubenswrapper[4830]: I1203 23:16:21.725520 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b4bc5c798-s26kt_35bdd835-3ab4-4828-bd70-6d3f0df5131f/prometheus-operator-admission-webhook/0.log" Dec 03 23:16:21 crc kubenswrapper[4830]: I1203 23:16:21.847024 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-49j7v_a167735b-f973-4627-b731-0d4ab1458916/operator/0.log" Dec 03 23:16:21 crc kubenswrapper[4830]: I1203 23:16:21.971033 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-zjhqs_90d67c7a-8ed4-4d7c-99c6-854fcc1c0c85/perses-operator/0.log" Dec 03 23:16:26 crc kubenswrapper[4830]: I1203 23:16:26.681127 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:16:26 crc kubenswrapper[4830]: I1203 23:16:26.681780 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:16:36 crc kubenswrapper[4830]: I1203 23:16:36.393905 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7dcf7ddf84-vj9kj_aac5b4ce-2952-49d3-81bc-4ace758e5367/kube-rbac-proxy/0.log" Dec 03 23:16:36 crc kubenswrapper[4830]: I1203 23:16:36.414594 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7dcf7ddf84-vj9kj_aac5b4ce-2952-49d3-81bc-4ace758e5367/manager/0.log" Dec 03 23:16:56 crc kubenswrapper[4830]: I1203 23:16:56.681213 4830 
patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:16:56 crc kubenswrapper[4830]: I1203 23:16:56.681843 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:17:26 crc kubenswrapper[4830]: I1203 23:17:26.680934 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:17:26 crc kubenswrapper[4830]: I1203 23:17:26.681575 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:17:26 crc kubenswrapper[4830]: I1203 23:17:26.681626 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 23:17:26 crc kubenswrapper[4830]: I1203 23:17:26.682749 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94831fe442d079e7d7e75327e587f8e56c3f30401cd93e0c187206047c524a0a"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 23:17:26 crc kubenswrapper[4830]: I1203 23:17:26.682823 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://94831fe442d079e7d7e75327e587f8e56c3f30401cd93e0c187206047c524a0a" gracePeriod=600 Dec 03 23:17:27 crc kubenswrapper[4830]: I1203 23:17:27.399226 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="94831fe442d079e7d7e75327e587f8e56c3f30401cd93e0c187206047c524a0a" exitCode=0 Dec 03 23:17:27 crc kubenswrapper[4830]: I1203 23:17:27.399282 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"94831fe442d079e7d7e75327e587f8e56c3f30401cd93e0c187206047c524a0a"} Dec 03 23:17:27 crc kubenswrapper[4830]: I1203 23:17:27.399319 4830 scope.go:117] "RemoveContainer" containerID="d2aff347062a3ea35fea1481c90d83684c74a20837fa97a1e81438f4e66c759a" Dec 03 23:17:28 crc kubenswrapper[4830]: I1203 23:17:28.410060 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerStarted","Data":"954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042"} Dec 03 23:18:18 crc kubenswrapper[4830]: I1203 23:18:18.918262 4830 generic.go:334] "Generic (PLEG): container finished" podID="490687ea-7427-474e-9684-0894513faa07" containerID="174fb4b763bb4f163bc85a04294013f98177580ffc902eddc561650221ac25ca" exitCode=0 Dec 03 23:18:18 crc kubenswrapper[4830]: I1203 23:18:18.918351 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-bjgst/must-gather-n2ml9" event={"ID":"490687ea-7427-474e-9684-0894513faa07","Type":"ContainerDied","Data":"174fb4b763bb4f163bc85a04294013f98177580ffc902eddc561650221ac25ca"} Dec 03 23:18:18 crc kubenswrapper[4830]: I1203 23:18:18.919725 4830 scope.go:117] "RemoveContainer" containerID="174fb4b763bb4f163bc85a04294013f98177580ffc902eddc561650221ac25ca" Dec 03 23:18:19 crc kubenswrapper[4830]: I1203 23:18:19.043953 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bjgst_must-gather-n2ml9_490687ea-7427-474e-9684-0894513faa07/gather/0.log" Dec 03 23:18:29 crc kubenswrapper[4830]: I1203 23:18:29.690681 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bjgst/must-gather-n2ml9"] Dec 03 23:18:29 crc kubenswrapper[4830]: I1203 23:18:29.691445 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-bjgst/must-gather-n2ml9" podUID="490687ea-7427-474e-9684-0894513faa07" containerName="copy" containerID="cri-o://bd3fa853516a4ed5b396353cb681f5404865f957aebbb08a5ee849fe31ba19a3" gracePeriod=2 Dec 03 23:18:29 crc kubenswrapper[4830]: I1203 23:18:29.700897 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bjgst/must-gather-n2ml9"] Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.026058 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bjgst_must-gather-n2ml9_490687ea-7427-474e-9684-0894513faa07/copy/0.log" Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.026819 4830 generic.go:334] "Generic (PLEG): container finished" podID="490687ea-7427-474e-9684-0894513faa07" containerID="bd3fa853516a4ed5b396353cb681f5404865f957aebbb08a5ee849fe31ba19a3" exitCode=143 Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.219492 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-bjgst_must-gather-n2ml9_490687ea-7427-474e-9684-0894513faa07/copy/0.log" Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.223042 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjgst/must-gather-n2ml9" Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.384427 4830 scope.go:117] "RemoveContainer" containerID="bd3fa853516a4ed5b396353cb681f5404865f957aebbb08a5ee849fe31ba19a3" Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.397117 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9dfn\" (UniqueName: \"kubernetes.io/projected/490687ea-7427-474e-9684-0894513faa07-kube-api-access-z9dfn\") pod \"490687ea-7427-474e-9684-0894513faa07\" (UID: \"490687ea-7427-474e-9684-0894513faa07\") " Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.397330 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/490687ea-7427-474e-9684-0894513faa07-must-gather-output\") pod \"490687ea-7427-474e-9684-0894513faa07\" (UID: \"490687ea-7427-474e-9684-0894513faa07\") " Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.404768 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490687ea-7427-474e-9684-0894513faa07-kube-api-access-z9dfn" (OuterVolumeSpecName: "kube-api-access-z9dfn") pod "490687ea-7427-474e-9684-0894513faa07" (UID: "490687ea-7427-474e-9684-0894513faa07"). InnerVolumeSpecName "kube-api-access-z9dfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.432248 4830 scope.go:117] "RemoveContainer" containerID="174fb4b763bb4f163bc85a04294013f98177580ffc902eddc561650221ac25ca" Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.500612 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9dfn\" (UniqueName: \"kubernetes.io/projected/490687ea-7427-474e-9684-0894513faa07-kube-api-access-z9dfn\") on node \"crc\" DevicePath \"\"" Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.592304 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/490687ea-7427-474e-9684-0894513faa07-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "490687ea-7427-474e-9684-0894513faa07" (UID: "490687ea-7427-474e-9684-0894513faa07"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:18:30 crc kubenswrapper[4830]: I1203 23:18:30.602231 4830 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/490687ea-7427-474e-9684-0894513faa07-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 23:18:31 crc kubenswrapper[4830]: I1203 23:18:31.035075 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjgst/must-gather-n2ml9" Dec 03 23:18:31 crc kubenswrapper[4830]: I1203 23:18:31.349690 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490687ea-7427-474e-9684-0894513faa07" path="/var/lib/kubelet/pods/490687ea-7427-474e-9684-0894513faa07/volumes" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.078796 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sv9zt"] Dec 03 23:18:41 crc kubenswrapper[4830]: E1203 23:18:41.079709 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerName="registry-server" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.079721 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerName="registry-server" Dec 03 23:18:41 crc kubenswrapper[4830]: E1203 23:18:41.079741 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerName="extract-utilities" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.079749 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerName="extract-utilities" Dec 03 23:18:41 crc kubenswrapper[4830]: E1203 23:18:41.079762 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerName="extract-content" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.079769 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerName="extract-content" Dec 03 23:18:41 crc kubenswrapper[4830]: E1203 23:18:41.079786 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490687ea-7427-474e-9684-0894513faa07" containerName="gather" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.079792 4830 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="490687ea-7427-474e-9684-0894513faa07" containerName="gather" Dec 03 23:18:41 crc kubenswrapper[4830]: E1203 23:18:41.079815 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490687ea-7427-474e-9684-0894513faa07" containerName="copy" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.079820 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="490687ea-7427-474e-9684-0894513faa07" containerName="copy" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.080075 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="490687ea-7427-474e-9684-0894513faa07" containerName="gather" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.080092 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="510bea8f-d0af-4eeb-bb35-03b994c50e02" containerName="registry-server" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.080105 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="490687ea-7427-474e-9684-0894513faa07" containerName="copy" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.081901 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.103321 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sv9zt"] Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.176167 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-catalog-content\") pod \"redhat-marketplace-sv9zt\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.176340 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-utilities\") pod \"redhat-marketplace-sv9zt\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.176422 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csdct\" (UniqueName: \"kubernetes.io/projected/9499f362-60cd-4816-aa75-44f875a0f6b3-kube-api-access-csdct\") pod \"redhat-marketplace-sv9zt\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.278099 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-catalog-content\") pod \"redhat-marketplace-sv9zt\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.278173 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-utilities\") pod \"redhat-marketplace-sv9zt\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.278226 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csdct\" (UniqueName: \"kubernetes.io/projected/9499f362-60cd-4816-aa75-44f875a0f6b3-kube-api-access-csdct\") pod \"redhat-marketplace-sv9zt\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.278666 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-catalog-content\") pod \"redhat-marketplace-sv9zt\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.278986 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-utilities\") pod \"redhat-marketplace-sv9zt\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.298464 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csdct\" (UniqueName: \"kubernetes.io/projected/9499f362-60cd-4816-aa75-44f875a0f6b3-kube-api-access-csdct\") pod \"redhat-marketplace-sv9zt\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.409396 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:41 crc kubenswrapper[4830]: I1203 23:18:41.933347 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sv9zt"] Dec 03 23:18:43 crc kubenswrapper[4830]: I1203 23:18:43.165661 4830 generic.go:334] "Generic (PLEG): container finished" podID="9499f362-60cd-4816-aa75-44f875a0f6b3" containerID="d98d2b8781c24b7f425fabcfd51850db666aeae01c8bec80d0db8471b06cf7ae" exitCode=0 Dec 03 23:18:43 crc kubenswrapper[4830]: I1203 23:18:43.165910 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sv9zt" event={"ID":"9499f362-60cd-4816-aa75-44f875a0f6b3","Type":"ContainerDied","Data":"d98d2b8781c24b7f425fabcfd51850db666aeae01c8bec80d0db8471b06cf7ae"} Dec 03 23:18:43 crc kubenswrapper[4830]: I1203 23:18:43.166034 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sv9zt" event={"ID":"9499f362-60cd-4816-aa75-44f875a0f6b3","Type":"ContainerStarted","Data":"326941830d1e91311f85ab7c2a17d3eab61d34cd75ea47269fa8476b40836808"} Dec 03 23:18:43 crc kubenswrapper[4830]: I1203 23:18:43.168181 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:18:45 crc kubenswrapper[4830]: I1203 23:18:45.185941 4830 generic.go:334] "Generic (PLEG): container finished" podID="9499f362-60cd-4816-aa75-44f875a0f6b3" containerID="15994083169ef20c24cc87c6d6527296130a368b13f63a91dbb6d85e5e5051f7" exitCode=0 Dec 03 23:18:45 crc kubenswrapper[4830]: I1203 23:18:45.186149 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sv9zt" event={"ID":"9499f362-60cd-4816-aa75-44f875a0f6b3","Type":"ContainerDied","Data":"15994083169ef20c24cc87c6d6527296130a368b13f63a91dbb6d85e5e5051f7"} Dec 03 23:18:46 crc kubenswrapper[4830]: I1203 23:18:46.195861 4830 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-sv9zt" event={"ID":"9499f362-60cd-4816-aa75-44f875a0f6b3","Type":"ContainerStarted","Data":"2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a"} Dec 03 23:18:46 crc kubenswrapper[4830]: I1203 23:18:46.214457 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sv9zt" podStartSLOduration=2.805613582 podStartE2EDuration="5.214437964s" podCreationTimestamp="2025-12-03 23:18:41 +0000 UTC" firstStartedPulling="2025-12-03 23:18:43.167992441 +0000 UTC m=+4412.164453790" lastFinishedPulling="2025-12-03 23:18:45.576816823 +0000 UTC m=+4414.573278172" observedRunningTime="2025-12-03 23:18:46.212030468 +0000 UTC m=+4415.208491817" watchObservedRunningTime="2025-12-03 23:18:46.214437964 +0000 UTC m=+4415.210899323" Dec 03 23:18:51 crc kubenswrapper[4830]: I1203 23:18:51.409954 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:51 crc kubenswrapper[4830]: I1203 23:18:51.410604 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:51 crc kubenswrapper[4830]: I1203 23:18:51.463351 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:52 crc kubenswrapper[4830]: I1203 23:18:52.300920 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:52 crc kubenswrapper[4830]: I1203 23:18:52.368414 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sv9zt"] Dec 03 23:18:54 crc kubenswrapper[4830]: I1203 23:18:54.274057 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sv9zt" 
podUID="9499f362-60cd-4816-aa75-44f875a0f6b3" containerName="registry-server" containerID="cri-o://2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a" gracePeriod=2 Dec 03 23:18:54 crc kubenswrapper[4830]: I1203 23:18:54.975260 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.064260 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csdct\" (UniqueName: \"kubernetes.io/projected/9499f362-60cd-4816-aa75-44f875a0f6b3-kube-api-access-csdct\") pod \"9499f362-60cd-4816-aa75-44f875a0f6b3\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.064446 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-utilities\") pod \"9499f362-60cd-4816-aa75-44f875a0f6b3\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.064475 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-catalog-content\") pod \"9499f362-60cd-4816-aa75-44f875a0f6b3\" (UID: \"9499f362-60cd-4816-aa75-44f875a0f6b3\") " Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.065396 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-utilities" (OuterVolumeSpecName: "utilities") pod "9499f362-60cd-4816-aa75-44f875a0f6b3" (UID: "9499f362-60cd-4816-aa75-44f875a0f6b3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.070112 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9499f362-60cd-4816-aa75-44f875a0f6b3-kube-api-access-csdct" (OuterVolumeSpecName: "kube-api-access-csdct") pod "9499f362-60cd-4816-aa75-44f875a0f6b3" (UID: "9499f362-60cd-4816-aa75-44f875a0f6b3"). InnerVolumeSpecName "kube-api-access-csdct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.082644 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9499f362-60cd-4816-aa75-44f875a0f6b3" (UID: "9499f362-60cd-4816-aa75-44f875a0f6b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.167551 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csdct\" (UniqueName: \"kubernetes.io/projected/9499f362-60cd-4816-aa75-44f875a0f6b3-kube-api-access-csdct\") on node \"crc\" DevicePath \"\"" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.167582 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.167591 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9499f362-60cd-4816-aa75-44f875a0f6b3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.286856 4830 generic.go:334] "Generic (PLEG): container finished" podID="9499f362-60cd-4816-aa75-44f875a0f6b3" 
containerID="2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a" exitCode=0 Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.286976 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sv9zt" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.286958 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sv9zt" event={"ID":"9499f362-60cd-4816-aa75-44f875a0f6b3","Type":"ContainerDied","Data":"2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a"} Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.287480 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sv9zt" event={"ID":"9499f362-60cd-4816-aa75-44f875a0f6b3","Type":"ContainerDied","Data":"326941830d1e91311f85ab7c2a17d3eab61d34cd75ea47269fa8476b40836808"} Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.287521 4830 scope.go:117] "RemoveContainer" containerID="2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.313147 4830 scope.go:117] "RemoveContainer" containerID="15994083169ef20c24cc87c6d6527296130a368b13f63a91dbb6d85e5e5051f7" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.332345 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sv9zt"] Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.341431 4830 scope.go:117] "RemoveContainer" containerID="d98d2b8781c24b7f425fabcfd51850db666aeae01c8bec80d0db8471b06cf7ae" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.350549 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sv9zt"] Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.389020 4830 scope.go:117] "RemoveContainer" containerID="2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a" Dec 03 
23:18:55 crc kubenswrapper[4830]: E1203 23:18:55.389457 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a\": container with ID starting with 2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a not found: ID does not exist" containerID="2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.389520 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a"} err="failed to get container status \"2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a\": rpc error: code = NotFound desc = could not find container \"2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a\": container with ID starting with 2d5b6cdc74009befd19dceaa280e955b385566a673244b02fd29c107aa51ed6a not found: ID does not exist" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.389555 4830 scope.go:117] "RemoveContainer" containerID="15994083169ef20c24cc87c6d6527296130a368b13f63a91dbb6d85e5e5051f7" Dec 03 23:18:55 crc kubenswrapper[4830]: E1203 23:18:55.390038 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15994083169ef20c24cc87c6d6527296130a368b13f63a91dbb6d85e5e5051f7\": container with ID starting with 15994083169ef20c24cc87c6d6527296130a368b13f63a91dbb6d85e5e5051f7 not found: ID does not exist" containerID="15994083169ef20c24cc87c6d6527296130a368b13f63a91dbb6d85e5e5051f7" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.390076 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15994083169ef20c24cc87c6d6527296130a368b13f63a91dbb6d85e5e5051f7"} err="failed to get container status 
\"15994083169ef20c24cc87c6d6527296130a368b13f63a91dbb6d85e5e5051f7\": rpc error: code = NotFound desc = could not find container \"15994083169ef20c24cc87c6d6527296130a368b13f63a91dbb6d85e5e5051f7\": container with ID starting with 15994083169ef20c24cc87c6d6527296130a368b13f63a91dbb6d85e5e5051f7 not found: ID does not exist" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.390105 4830 scope.go:117] "RemoveContainer" containerID="d98d2b8781c24b7f425fabcfd51850db666aeae01c8bec80d0db8471b06cf7ae" Dec 03 23:18:55 crc kubenswrapper[4830]: E1203 23:18:55.390378 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d98d2b8781c24b7f425fabcfd51850db666aeae01c8bec80d0db8471b06cf7ae\": container with ID starting with d98d2b8781c24b7f425fabcfd51850db666aeae01c8bec80d0db8471b06cf7ae not found: ID does not exist" containerID="d98d2b8781c24b7f425fabcfd51850db666aeae01c8bec80d0db8471b06cf7ae" Dec 03 23:18:55 crc kubenswrapper[4830]: I1203 23:18:55.390400 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98d2b8781c24b7f425fabcfd51850db666aeae01c8bec80d0db8471b06cf7ae"} err="failed to get container status \"d98d2b8781c24b7f425fabcfd51850db666aeae01c8bec80d0db8471b06cf7ae\": rpc error: code = NotFound desc = could not find container \"d98d2b8781c24b7f425fabcfd51850db666aeae01c8bec80d0db8471b06cf7ae\": container with ID starting with d98d2b8781c24b7f425fabcfd51850db666aeae01c8bec80d0db8471b06cf7ae not found: ID does not exist" Dec 03 23:18:57 crc kubenswrapper[4830]: I1203 23:18:57.347743 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9499f362-60cd-4816-aa75-44f875a0f6b3" path="/var/lib/kubelet/pods/9499f362-60cd-4816-aa75-44f875a0f6b3/volumes" Dec 03 23:19:56 crc kubenswrapper[4830]: I1203 23:19:56.681655 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:19:56 crc kubenswrapper[4830]: I1203 23:19:56.682297 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:20:26 crc kubenswrapper[4830]: I1203 23:20:26.681154 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:20:26 crc kubenswrapper[4830]: I1203 23:20:26.681799 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:20:56 crc kubenswrapper[4830]: I1203 23:20:56.681530 4830 patch_prober.go:28] interesting pod/machine-config-daemon-nfl7k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:20:56 crc kubenswrapper[4830]: I1203 23:20:56.682269 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:20:56 crc kubenswrapper[4830]: I1203 23:20:56.682318 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" Dec 03 23:20:56 crc kubenswrapper[4830]: I1203 23:20:56.683296 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042"} pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 23:20:56 crc kubenswrapper[4830]: I1203 23:20:56.683364 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerName="machine-config-daemon" containerID="cri-o://954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042" gracePeriod=600 Dec 03 23:20:56 crc kubenswrapper[4830]: E1203 23:20:56.803982 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:20:57 crc kubenswrapper[4830]: I1203 23:20:57.411307 4830 generic.go:334] "Generic (PLEG): container finished" podID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" containerID="954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042" exitCode=0 Dec 03 23:20:57 crc kubenswrapper[4830]: I1203 23:20:57.411686 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" event={"ID":"d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad","Type":"ContainerDied","Data":"954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042"} Dec 03 23:20:57 crc kubenswrapper[4830]: I1203 23:20:57.411724 4830 scope.go:117] "RemoveContainer" containerID="94831fe442d079e7d7e75327e587f8e56c3f30401cd93e0c187206047c524a0a" Dec 03 23:20:57 crc kubenswrapper[4830]: I1203 23:20:57.412571 4830 scope.go:117] "RemoveContainer" containerID="954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042" Dec 03 23:20:57 crc kubenswrapper[4830]: E1203 23:20:57.412885 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:21:11 crc kubenswrapper[4830]: I1203 23:21:11.353021 4830 scope.go:117] "RemoveContainer" containerID="954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042" Dec 03 23:21:11 crc kubenswrapper[4830]: E1203 23:21:11.354150 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:21:22 crc kubenswrapper[4830]: I1203 23:21:22.337770 4830 scope.go:117] "RemoveContainer" containerID="954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042" Dec 03 23:21:22 crc kubenswrapper[4830]: E1203 23:21:22.338960 4830 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:21:33 crc kubenswrapper[4830]: I1203 23:21:33.337232 4830 scope.go:117] "RemoveContainer" containerID="954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042" Dec 03 23:21:33 crc kubenswrapper[4830]: E1203 23:21:33.338010 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:21:47 crc kubenswrapper[4830]: I1203 23:21:47.338340 4830 scope.go:117] "RemoveContainer" containerID="954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042" Dec 03 23:21:47 crc kubenswrapper[4830]: E1203 23:21:47.339595 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:21:59 crc kubenswrapper[4830]: I1203 23:21:59.337988 4830 scope.go:117] "RemoveContainer" containerID="954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042" Dec 03 23:21:59 crc kubenswrapper[4830]: E1203 
23:21:59.339284 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:22:11 crc kubenswrapper[4830]: I1203 23:22:11.347466 4830 scope.go:117] "RemoveContainer" containerID="954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042" Dec 03 23:22:11 crc kubenswrapper[4830]: E1203 23:22:11.348251 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:22:26 crc kubenswrapper[4830]: I1203 23:22:26.336981 4830 scope.go:117] "RemoveContainer" containerID="954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042" Dec 03 23:22:26 crc kubenswrapper[4830]: E1203 23:22:26.338318 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad" Dec 03 23:22:41 crc kubenswrapper[4830]: I1203 23:22:41.343953 4830 scope.go:117] "RemoveContainer" containerID="954f5fdae4c3b30c22f88acbc11a67b21c7e5eec6f0374a73e02fb9bfcbae042" Dec 03 23:22:41 crc 
kubenswrapper[4830]: E1203 23:22:41.344794 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-nfl7k_openshift-machine-config-operator(d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad)\"" pod="openshift-machine-config-operator/machine-config-daemon-nfl7k" podUID="d0d9f02c-88ac-490a-8ec3-2f3fbb0ae3ad"